initial commit
pumpikano committed Dec 22, 2015
0 parents commit 1f6c1cd
Showing 33 changed files with 3,585 additions and 0 deletions.
59 changes: 59 additions & 0 deletions .gitignore
@@ -0,0 +1,59 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]

# C extensions
*.so

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover

# Translations
*.mo
*.pot

# Django stuff:
*.log

# Sphinx documentation
docs/_build/

# PyBuilder
target/

.DS_Store
124 changes: 124 additions & 0 deletions LICENSE
@@ -0,0 +1,124 @@
Apache License
Version 2.0, January 2004

http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.
"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by
Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting
the License.

"Legal Entity" shall mean the union of the acting entity and all other entities that control, are
controlled by, or are under common control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the direction or management of such entity,
whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding
shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications, including but not limited to software
source code, documentation source, and configuration files.

"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form,
including but not limited to compiled object code, generated documentation, and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License,
as indicated by a copyright notice that is included in or attached to the work (an example is provided in
the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from)
the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent,
as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not
include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work
and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including the original version of the Work and any
modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to
Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to
submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of
electronic, verbal, or written communication sent to the Licensor or its representatives, including but not
limited to communication on electronic mailing lists, source code control systems, and issue tracking systems
that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner
as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been
received by Licensor and subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby
grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license
to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute
the Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby
grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated
in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer
the Work, where such license applies only to those patent claims licensable by such Contributor that are
necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work
to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including
a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the
Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under
this License for that Work shall terminate as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any
medium, with or without modifications, and in Source or Object form, provided that You meet the following
conditions:

You must give any other recipients of the Work or Derivative Works a copy of this License; and
You must cause any modified files to carry prominent notices stating that You changed the files; and
You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent,
trademark, and attribution notices from the Source form of the Work, excluding those notices that do not
pertain to any part of the Derivative Works; and
If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You
distribute must include a readable copy of the attribution notices contained within such NOTICE file,
excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the
following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source
form or documentation, if provided along with the Derivative Works; or, within a display generated by the
Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file
are for informational purposes only and do not modify the License. You may add Your own attribution notices
within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work,
provided that such additional attribution notices cannot be construed as modifying the License.

You may add Your own copyright statement to Your modifications and may provide additional or different
license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such
Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise
complies with the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally
submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this
License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall
supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding
such Contributions.

6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or
product names of the Licensor, except as required for reasonable and customary use in describing the origin
of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the
Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS
OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE,
NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for
determining the appropriateness of using or redistributing the Work and assume any risks associated with Your
exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence),
contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts)
or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect,
special, incidental, or consequential damages of any character arising as a result of this License or out
of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work
stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such
Contributor has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof,
You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other
liability obligations and/or rights consistent with this License. However, in accepting such obligations,
You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor,
and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred
by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional
liability.

END OF TERMS AND CONDITIONS
62 changes: 62 additions & 0 deletions README.md
@@ -0,0 +1,62 @@
# Locally Optimized Product Quantization

This repository contains Python training and testing code for Locally Optimized Product Quantization (LOPQ) models, as well as Spark scripts to scale training to hundreds of millions of vectors. The resulting model can be used in Python with the code provided here, or deployed via a protobuf format to, e.g., search backends for high-performance approximate nearest neighbor search.

### Overview

Locally Optimized Product Quantization (LOPQ) [1] is a hierarchical quantization algorithm that produces codes of configurable length for data points. These codes are efficient representations of the original vector and can be used in a variety of ways depending on the application: as locality-preserving hashes, as a compressed representation from which an approximation of the original vector can be reconstructed, and as a representation from which an approximation of the Euclidean distance between points can be computed.

Conceptually, the LOPQ quantization process can be broken into four phases. The training process fits the parameters of each phase to the data in the same order.

1. The raw data vector is PCA'd to `D` dimensions (possibly the original dimensionality). This allows subsequent quantization to more efficiently represent the variation present in the data.
2. The PCA'd data is then product quantized [2] by two k-means quantizers. This means that each vector is split into two subvectors each of dimension `D / 2`, and each of the two subspaces is quantized independently with a vocabulary of size `V`. Since the two quantizations occur independently, the dimensions of the vectors are permuted such that the total variance in each of the two subspaces is approximately equal, which allows the two vocabularies to be equally important in terms of capturing the total variance of the data. This results in a pair of cluster ids that we refer to as "coarse codes".
3. The residuals of the data after coarse quantization are computed. The residuals are then locally projected independently for each coarse cluster. This projection is another application of PCA and dimension permutation on the residuals and it is "local" in the sense that there is a different projection for each cluster in each of the two coarse vocabularies. These local rotations make the next and final step, another application of product quantization, very efficient in capturing the variance of the residuals.
4. The locally projected data is then product quantized a final time by `M` subquantizers, resulting in `M` "fine codes". Usually the vocabulary size for each of these subquantizers is a power of 2 for efficient storage in a search index. With vocabularies of size 256, the fine codes for each indexed vector require `M` bytes of storage in the index.

The final LOPQ code for a vector is a `(coarse codes, fine codes)` pair, e.g. `((3, 2), (14, 164, 83, 49, 185, 29, 196, 250))`.
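
The four phases above can be condensed into a short numpy sketch. This is purely illustrative: the parameter names (`pca_P`, `coarse`, `rotations`, `subquantizers`) and the random matrices standing in for fitted parameters are assumptions made for the sake of a runnable example, not the `lopq` module's API, and residual mean subtraction is omitted for brevity.

```python
import numpy as np

D, V, M, clusters = 8, 4, 4, 256   # PCA dims, coarse vocab size, subquantizers, fine vocab size
rng = np.random.RandomState(0)

# Hypothetical "fitted" parameters (random stand-ins for illustration only)
pca_P = rng.randn(D, D)                                           # phase 1: PCA projection
coarse = [rng.randn(V, D // 2) for _ in range(2)]                 # phase 2: two coarse codebooks
rotations = [[rng.randn(D // 2, D // 2) for _ in range(V)] for _ in range(2)]  # phase 3: local rotations
subquantizers = [rng.randn(clusters, D // M) for _ in range(M)]   # phase 4: M fine codebooks

x = rng.randn(D)

# 1. Project the raw vector to D dimensions
px = pca_P.dot(x)

# 2. Coarse product quantization: one code per half of the projected vector
halves = np.split(px, 2)
coarse_codes = tuple(int(np.argmin(((C - h) ** 2).sum(axis=1)))
                     for C, h in zip(coarse, halves))

# 3. Residuals after coarse quantization, locally rotated per coarse cluster
residuals = [h - C[c] for C, h, c in zip(coarse, halves, coarse_codes)]
rotated = np.concatenate([rotations[i][c].dot(r)
                          for i, (c, r) in enumerate(zip(coarse_codes, residuals))])

# 4. Fine product quantization of the locally projected residual
fine_codes = tuple(int(np.argmin(((S - s) ** 2).sum(axis=1)))
                   for S, s in zip(subquantizers, np.split(rotated, M)))

print((coarse_codes, fine_codes))   # a (coarse codes, fine codes) pair, as described above
```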

### Nearest Neighbor Search

A nearest neighbor index can be built from these LOPQ codes by indexing each document into its corresponding coarse code bucket. That is, each pair of coarse codes (which we refer to as a "cell") will index a bucket of the vectors quantizing to that cell.
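
As a sketch of this bucketing (plain Python data structures standing in for whatever backend actually holds the index; the code pair format is the one described above):

```python
from collections import defaultdict

# cell (a pair of coarse codes) -> list of (item_id, fine_codes) kept for later ranking
index = defaultdict(list)

def add_to_index(item_id, lopq_code):
    coarse_codes, fine_codes = lopq_code
    index[coarse_codes].append((item_id, fine_codes))

add_to_index(42, ((3, 2), (14, 164, 83, 49)))   # hypothetical item id and codes
```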

At query time, an incoming query vector undergoes substantially the same process. First, the query is split into coarse subvectors and the distance to each coarse centroid is computed. These distances can be used to efficiently compute a priority-ordered sequence of cells [3] such that cells later in the sequence are less likely to have near neighbors of the query than earlier cells. The items in cell buckets are retrieved in this order until some desired quota has been met.
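
A sketch of this ordering, reusing the hypothetical `coarse` codebooks from the encoding sketch above. For clarity it enumerates and sorts all `V * V` cells; the multi-sequence algorithm of [3] produces the same ordering lazily with a priority queue, which matters when `V` is large.

```python
import numpy as np

def ranked_cells(query_halves, coarse):
    """Order cells by the sum of the query's squared distances to the two coarse centroids."""
    dists = [((C - h) ** 2).sum(axis=1) for C, h in zip(coarse, query_halves)]
    V = len(dists[0])
    scored = [(dists[0][i] + dists[1][j], (i, j)) for i in range(V) for j in range(V)]
    return [cell for _, cell in sorted(scored)]

# Cell buckets would then be drained in this order until the retrieval quota is met.
```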

After this retrieval phase, the fine codes are used to rank by approximate Euclidean distance. The query is projected into each local space and the distance to each indexed item is estimated as the sum of the squared distances of the query subvectors to the corresponding subquantizer centroid indexed by the fine codes.
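
A minimal sketch of this ranking step, again reusing the hypothetical `subquantizers` from the encoding sketch: distances from the query's locally projected residual to every subquantizer centroid are tabulated once per cell, so each indexed item's approximate distance costs only `M` table lookups.

```python
import numpy as np

def rank_cell(query_rotated, items, subquantizers, M):
    """Rank (item_id, fine_codes) pairs from one cell by approximate squared distance.

    query_rotated: the query residual after the local projection for this cell.
    subquantizers: M codebooks, each of shape (clusters, D // M).
    """
    query_subs = np.split(query_rotated, M)
    tables = [((S - q) ** 2).sum(axis=1) for S, q in zip(subquantizers, query_subs)]

    def approx_dist(fine_codes):
        return sum(tables[m][c] for m, c in enumerate(fine_codes))

    return sorted(items, key=lambda item: approx_dist(item[1]))
```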

When implemented well, NN search with LOPQ is highly scalable, with low index storage requirements and low query-time latency.

#### References

More information and performance benchmarks can be found at http://image.ntua.gr/iva/research/lopq/.

1. Y. Kalantidis, Y. Avrithis. [Locally Optimized Product Quantization for Approximate Nearest Neighbor Search.](http://image.ntua.gr/iva/files/lopq.pdf) CVPR 2014.
2. H. Jegou, M. Douze, and C. Schmid. [Product quantization for nearest neighbor search.](https://lear.inrialpes.fr/pubs/2011/JDS11/jegou_searching_with_quantization.pdf) PAMI, 33(1), 2011.
3. A. Babenko and V. Lempitsky. [The inverted multi-index.](http://www.computer.org/csdl/trans/tp/preprint/06915715.pdf) CVPR 2012.

### Python

Full LOPQ training and evaluation is implemented in the `lopq` Python module. Please refer to the README in `python/` for more detail.
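
As a rough sketch of what usage might look like (class and method names such as `LOPQModel`, `fit`, `predict`, `LOPQSearcher`, `add_data`, and `search` are assumptions here; the README in `python/` is authoritative):

```python
import numpy as np
from lopq import LOPQModel, LOPQSearcher   # names assumed; see python/README.md

data = np.random.randn(10000, 128)

model = LOPQModel(V=16, M=8)   # 16 coarse clusters per half, 8 fine subquantizers
model.fit(data)

code = model.predict(data[0])  # a (coarse codes, fine codes) pair

searcher = LOPQSearcher(model)
searcher.add_data(data)
neighbors = searcher.search(data[0])
```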

### Spark

The training algorithm is also implemented on Spark using `pyspark` to scale parameter fitting to large datasets. Please refer to the README in `spark/` for documentation and usage information.

#### Running Tests

Tests can be run during development with:

```bash
cd python/
bash test.sh
```

This project uses [tox](http://tox.testrun.org/) to run tests in a virtual environment. Tox can be installed with `pip install tox` and run from the `python/` directory:

```bash
cd python/
tox
```

#### License

Code is licensed under the Apache License, Version 2.0. See the LICENSE file for terms.
5 changes: 5 additions & 0 deletions data/oxford/README.md
@@ -0,0 +1,5 @@
### Oxford dataset

This dataset is included for tests and for the short `example.py` tutorial. The `oxford_features.fvecs` file contains 5063 128-dimensional vectors, one for each image in the Oxford5k dataset (http://www.robots.ox.ac.uk/~vgg/data/oxbuildings/). The data format is the `.fvecs` format described at http://corpus-texmex.irisa.fr/.

The features themselves are the rectified fc7 layer outputs of the BVLC Caffe reference model (https://github.com/BVLC/caffe/tree/master/models/bvlc_reference_caffenet) extracted from resized images and then PCA'd to 128 dimensions from their original 4096.
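
The `.fvecs` layout referenced above stores each vector as a little-endian int32 dimension followed by that many float32 components. A minimal numpy reader (a sketch, not necessarily the loader used by the `lopq` code) might look like:

```python
import numpy as np

def load_fvecs(path):
    """Read an .fvecs file into an (n, d) float32 array."""
    raw = np.fromfile(path, dtype=np.float32)
    d = raw[:1].view(np.int32)[0]      # first int32 of each record is the dimension
    vecs = raw.reshape(-1, d + 1)      # each row: [dim, v_0, ..., v_{d-1}]
    return vecs[:, 1:].copy()

features = load_fvecs('data/oxford/oxford_features.fvecs')
print(features.shape)                  # expected: (5063, 128)
```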
Binary file added data/oxford/oxford_features.fvecs
32 changes: 32 additions & 0 deletions protobuf/README.md
@@ -0,0 +1,32 @@
# Protocol buffer schema

The `.proto` file defines the data schema for LOPQ model parameters.
It can be compiled into libraries for any target language
that will use this data (e.g., Java and Python). A compiled version
for Python is included in the `lopq` module.

See [https://developers.google.com/protocol-buffers/docs/overview](https://developers.google.com/protocol-buffers/docs/overview)
for details on protocol buffers and how to update the schema.
Note that, to keep the schema backward compatible, it is important not to
alter the tags (the integer field numbers) assigned to each field in the `.proto` file.


### Compiling `.proto` file

Compile for Java:

```bash
# from the repository root
protoc -I=protobuf \
--java_out=. \
protobuf/lopq_model.proto
```

Compile for Python:

```bash
# from the repository root
protoc -I=protobuf \
--python_out=python/lopq \
protobuf/lopq_model.proto
```
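
After compiling for Python as above, the generated module lands in `python/lopq` and, by the standard `protoc` naming convention, is called `lopq_model_pb2`. A sketch of parsing a serialized model file with it (the `model.lopq` path is a placeholder):

```python
from lopq.lopq_model_pb2 import LOPQModelParams   # module name follows the protoc convention

params = LOPQModelParams()
with open('model.lopq', 'rb') as f:               # placeholder path
    params.ParseFromString(f.read())

print(params.D, params.V, params.M)               # model hyperparameters
print(len(params.Cs), len(params.Rs), len(params.subs))
```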
41 changes: 41 additions & 0 deletions protobuf/lopq_model.proto
@@ -0,0 +1,41 @@
// Copyright 2015, Yahoo Inc.
// Licensed under the terms of the Apache License, Version 2.0. See the LICENSE file associated with the project for terms.

// define a namespace for the protocol buffer
package com.flickr.vision.lopq;

// options for how java class should be built and packaged
option java_package = "com.flickr.vision.lopq";
option java_outer_classname = "LOPQModelParameters";
option optimize_for = SPEED;


/////////////////////////////////////////////////////////////////
// GENERIC TYPES

message Vector {
repeated float values = 1 [packed = true];
}

message Matrix {
repeated float values = 1 [packed = true];
repeated uint32 shape = 2;
}

/////////////////////////////////////////////////////////////////
// LOPQ PARAMS

// lopq model params
// file extension: .lopq
message LOPQModelParams {

optional uint32 D = 1; // dimensionality of original vectors
optional uint32 V = 2; // number of coarse quantizer centroids
optional uint32 M = 3; // number of subvectors
optional uint32 num_subquantizers = 4; // number of subquantizer clusters

repeated Matrix Cs = 5; // coarse quantizer centroids - 2 of these; size V x (D/2)
repeated Matrix Rs = 6; // rotations - 2 * V of these, each size D/2 x D/2
repeated Vector mus = 7; // residual means - 2 * V of these, each size D/2
repeated Matrix subs = 8; // subquantizer centroids - M of these, each size num_subquantizers x (D/M)
}
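
Given this schema, a numpy array round-trips through the `Matrix` message as a flat list of values plus a shape; a brief Python sketch using the generated `lopq_model_pb2` module described in the protobuf README:

```python
import numpy as np
from lopq.lopq_model_pb2 import Matrix   # generated module; name follows the protoc convention

def matrix_to_proto(arr):
    m = Matrix()
    m.values.extend(arr.astype(np.float32).ravel().tolist())
    m.shape.extend(arr.shape)
    return m

def proto_to_matrix(m):
    return np.array(m.values, dtype=np.float32).reshape(tuple(m.shape))

C = np.random.randn(16, 64).astype(np.float32)   # e.g. one coarse codebook, V x (D/2)
assert np.allclose(C, proto_to_matrix(matrix_to_proto(C)))
```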