# Installation

## Using the install script
Opaque SQL has an install script (`opaque-sql/scripts/install.sh`) that takes care of installing and downloading everything Opaque SQL needs to work immediately. To use it, first clone or fork the repository, then run:

```shell
cd opaque-sql
source ./scripts/install.sh
```
Note that `source` is necessary to properly set the environment variables Opaque SQL needs. If you don't want to use the install script, see the section on installing dependencies manually below. For testing, skip to running tests.
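To see why `source` matters: running a script with `bash` executes it in a child shell, so any variables it exports disappear when the script exits, whereas `source` runs it in your current shell. A minimal demonstration using a throwaway script (not part of Opaque SQL):

```shell
# Create a throwaway script that exports a variable (for illustration only)
echo 'export DEMO_VAR=set-by-script' > /tmp/demo_env.sh

bash /tmp/demo_env.sh                      # runs in a child shell
echo "after bash:   ${DEMO_VAR:-unset}"    # the variable did not survive

source /tmp/demo_env.sh                    # runs in the current shell
echo "after source: ${DEMO_VAR:-unset}"    # the variable is now set
```

The same reasoning applies to `opaqueenv` and `openenclaverc` later in this guide: they must be sourced, not executed.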
## Installing dependencies manually
After downloading the Opaque codebase, install necessary dependencies as follows.
Install dependencies and the OpenEnclave SDK. We currently support OE version 0.17.1 (so please install with `open-enclave=0.17.1`) and Ubuntu 18.04.

```shell
# For Ubuntu 18.04:
sudo apt -y install wget build-essential openjdk-8-jdk python libssl-dev libmbedtls-dev
pip3 install grpcio grpcio-tools # Needed for PySpark listener

# Install a newer version of CMake (3.15)
wget https://github.com/Kitware/CMake/releases/download/v3.15.6/cmake-3.15.6-Linux-x86_64.sh
sudo bash cmake-3.15.6-Linux-x86_64.sh --skip-license --prefix=/usr/local
rm cmake-3.15.6-Linux-x86_64.sh

# Install Spark 3.1.1 (if not already done)
wget https://archive.apache.org/dist/spark/spark-3.1.1/spark-3.1.1-bin-hadoop2.7.tgz
tar xvf spark-3.1.1*
sudo mkdir /opt/spark
sudo mv spark-3.1.1*/* /opt/spark
rm -rf spark-3.1.1*
sudo mkdir /opt/spark/work
sudo chmod -R a+wx /opt/spark/work
```
Change into the Opaque root directory and edit Opaque's environment variables in `opaqueenv` (including Spark configurations) if desired. Export the Opaque SQL and Open Enclave environment variables via:

```shell
source opaqueenv
source /opt/openenclave/share/openenclave/openenclaverc
```
By default, Opaque runs in hardware mode (environment variable `MODE=HARDWARE`). If you do not have a machine with a hardware enclave but still wish to test out Opaque's functionality locally, then set:

```shell
export MODE=SIMULATE
```
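If you are unsure which mode applies to your machine, one way to choose is to check whether an SGX device node is visible. This is a sketch, not part of Opaque SQL: the device paths `/dev/sgx_enclave` (in-kernel driver) and `/dev/isgx` (out-of-tree driver) are assumptions and vary by driver version.

```shell
# Select HARDWARE mode only when an SGX device node is present
# (device paths are assumptions for common SGX driver setups)
if [ -e /dev/sgx_enclave ] || [ -e /dev/isgx ]; then
    export MODE=HARDWARE
else
    export MODE=SIMULATE
fi
echo "MODE=$MODE"
```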
## Running tests

To run the Opaque tests:

```shell
build/sbt test
```
Alternatively, to run tests and generate coverage reports:

```shell
build/sbt clean coverage test
build/sbt coverageReport
```
## Additional configurations for running on a Spark cluster
Opaque SQL needs three Spark properties to be set:

- `spark.executor.instances=n` (`n` is usually the number of machines in the cluster)
- `spark.task.maxFailures=10` (attestation uses Spark's fault tolerance property)
- `spark.driver.defaultJavaOptions="-Dscala.color"` (if querying with MC2 Client)
These properties can be set in a custom configuration file, the default being located at `${SPARK_HOME}/conf/spark-defaults.conf`, or as a `spark-submit` or `spark-shell` argument: `--conf <key>=<value>`. For more details on running a Spark cluster, see the Spark documentation.