Add new tutorial: quantum classification (microsoft#460)
This tutorial introduces the circuit-centric classifiers featured in https://docs.microsoft.com/quantum/user-guide/libraries/machine-learning/. It gives a high-level overview of the training/classification workflow and a deep dive into the training and classification steps for a simple classification task.
tcNickolas authored Aug 17, 2020
1 parent f64eea0 commit 83c1f0f
Showing 13 changed files with 1,156 additions and 1 deletion.
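The classifier this commit adds is a single controlled Y-rotation acting on one qubit that amplitude-encodes a 2D feature vector. What such a circuit computes can be sketched classically; this is an illustrative reconstruction, not code from the commit, and the `classify` helper and its `p(1) + bias > 0.5` decision rule are assumptions about how the library's `InferredLabels` behaves.

```python
import math

def classify(sample, theta, bias):
    """Sketch of a one-qubit circuit-centric classifier.

    The 2D sample is amplitude-encoded as a single-qubit state,
    a Y-rotation by theta is applied, and the probability of
    measuring |1>, offset by a trained bias, decides the label.
    """
    # Amplitude-encode the (normalized) feature vector as (a, b).
    norm = math.hypot(sample[0], sample[1])
    a, b = sample[0] / norm, sample[1] / norm
    # Ry(theta) maps the |1> amplitude to sin(theta/2)*a + cos(theta/2)*b.
    amp1 = math.sin(theta / 2) * a + math.cos(theta / 2) * b
    p1 = amp1 ** 2
    # Assumed decision rule: label 1 if p(1) + bias exceeds 0.5.
    return 1 if p1 + bias > 0.5 else 0
```

With `theta = 0` and `bias = 0`, a sample along the first axis stays near |0> and gets label 0, while a sample along the second axis gets label 1.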
1 change: 1 addition & 0 deletions Dockerfile
@@ -15,6 +15,7 @@ USER root
# Install Python dependencies for the Python visualization and tutorial notebooks
RUN pip install -I --no-cache-dir \
matplotlib \
numpy \
pytest && \
# Give permissions to the jovyan user
chown -R ${USER} ${HOME} && \
2 changes: 2 additions & 0 deletions README.md
@@ -59,6 +59,8 @@ Each kata is a separate project that includes:
solution to the Deutsch–Jozsa problem to a classical one.
* **[Exploring Grover's search algorithm](./tutorials/ExploringGroversAlgorithm/)**.
Learn more about Grover's search algorithm, picking up where the [Grover's algorithm kata](./GroversAlgorithm/) left off.
* **[Quantum classification](./tutorials/QuantumClassification/)**.
Learn about circuit-centric classifiers and the quantum machine learning library included in the QDK.

## List of Katas <a name="kata-topics" /> ##

2 changes: 2 additions & 0 deletions index.ipynb
@@ -38,6 +38,8 @@
" and compare the quantum solution to the Deutsch–Jozsa problem to a classical one.\n",
"* **[Exploring Grover's search algorithm](./tutorials/ExploringGroversAlgorithm/ExploringGroversAlgorithmTutorial.ipynb)**.\n",
" Learn more about Grover's search algorithm, picking up where the [Grover's algorithm kata](./GroversAlgorithm/GroversAlgorithm.ipynb) left off.\n",
"* **[Quantum classification](./tutorials/QuantumClassification/ExploringQuantumClassificationLibrary.ipynb)**.\n",
" Learn about circuit-centric classifiers and the quantum machine learning library included in the QDK.\n",
"\n",
"## List of Katas\n",
"\n",
2 changes: 1 addition & 1 deletion scripts/steps-init.yml
@@ -15,7 +15,7 @@ steps:
architecture: 'x64'
displayName: 'Use Python 3.6'

- script: pip install setuptools wheel pytest jupyter
- script: pip install setuptools wheel pytest jupyter numpy matplotlib qsharp
displayName: 'Install Python tools'

- powershell: ./install-iqsharp.ps1
80 changes: 80 additions & 0 deletions tutorials/QuantumClassification/Backend.qs
@@ -0,0 +1,80 @@
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT License.

//////////////////////////////////////////////////////////////////////
// This file contains implementations of training and classification routines
// used in part 1 of the tutorial ("Exploring Quantum Classification Library").
// You should not modify anything in this file.
//////////////////////////////////////////////////////////////////////

namespace Microsoft.Quantum.Kata.QuantumClassification {
    open Microsoft.Quantum.Convert;
    open Microsoft.Quantum.Intrinsic;
    open Microsoft.Quantum.Canon;
    open Microsoft.Quantum.Arrays;
    open Microsoft.Quantum.MachineLearning;
    open Microsoft.Quantum.Math;

    function DefaultSchedule(samples : Double[][]) : SamplingSchedule {
        return SamplingSchedule([
            0..Length(samples) - 1
        ]);
    }

    // The definition of the classifier structure for the case when the data is linearly separable and fits into 1 qubit
    function ClassifierStructure() : ControlledRotation[] {
        return [
            ControlledRotation((0, new Int[0]), PauliY, 0)
        ];
    }


    // Entry point for training a model; takes the data as the input and uses the hard-coded classifier structure.
    operation TrainLinearlySeparableModel(
        trainingVectors : Double[][],
        trainingLabels : Int[],
        initialParameters : Double[][]
    ) : (Double[], Double) {
        // Convert the training data and labels into a single data structure.
        let samples = Mapped(
            LabeledSample,
            Zip(trainingVectors, trainingLabels)
        );
        let (optimizedModel, nMisses) = TrainSequentialClassifier(
            Mapped(
                SequentialModel(ClassifierStructure(), _, 0.0),
                initialParameters
            ),
            samples,
            DefaultTrainingOptions()
                w/ LearningRate <- 2.0
                w/ Tolerance <- 0.0005,
            DefaultSchedule(trainingVectors),
            DefaultSchedule(trainingVectors)
        );
        Message($"Training complete, found optimal parameters: {optimizedModel::Parameters}, {optimizedModel::Bias} with {nMisses} misses");
        return (optimizedModel::Parameters, optimizedModel::Bias);
    }


    // Entry point for using the model to classify the data; takes validation data and model parameters as inputs and uses the hard-coded classifier structure.
    operation ClassifyLinearlySeparableModel(
        samples : Double[][],
        parameters : Double[],
        bias : Double,
        tolerance : Double,
        nMeasurements : Int
    ) : Int[] {
        let model = Default<SequentialModel>()
            w/ Structure <- ClassifierStructure()
            w/ Parameters <- parameters
            w/ Bias <- bias;
        let probabilities = EstimateClassificationProbabilities(
            tolerance, model,
            samples, nMeasurements
        );
        return InferredLabels(model::Bias, probabilities);
    }

}
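`TrainLinearlySeparableModel` above hands the library several initial parameter vectors and keeps the best optimized model. A rough classical analogue of that selection step can be sketched as follows; this is an illustration, not the library's algorithm: the `misses` and `train` helpers and the `p(1) + bias > 0.5` decision rule are assumptions, and the grid search merely stands in for the library's gradient-based optimization.

```python
import math

def misses(samples, labels, theta, bias):
    """Count misclassifications of a one-qubit Ry(theta) classifier.

    Each 2D sample is amplitude-encoded on one qubit; the probability
    of measuring |1> after the rotation, offset by the bias, decides
    the predicted label (an assumed reading of the decision rule).
    """
    wrong = 0
    for (x, y), label in zip(samples, labels):
        norm = math.hypot(x, y)
        amp1 = math.sin(theta / 2) * (x / norm) + math.cos(theta / 2) * (y / norm)
        predicted = 1 if amp1 ** 2 + bias > 0.5 else 0
        wrong += predicted != label
    return wrong

def train(samples, labels, thetas, biases):
    # Stand-in for TrainSequentialClassifier: instead of optimizing
    # from each supplied starting point, grid-search the candidate
    # (theta, bias) pairs and keep the one with the fewest misses.
    return min(((t, b) for t in thetas for b in biases),
               key=lambda tb: misses(samples, labels, *tb))
```

For data where points near the first axis carry label 0 and points near the second axis carry label 1, the identity rotation with zero bias already separates the classes perfectly, so the search returns it.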

Large diffs are not rendered by default.
