# FAQs

This section gathers the most common questions from the community related to packages and usability of this repository.
- What is the policy on recipe name collisions?
- What is the policy on creating packages from pre-compiled binaries?
- Should reference names use `-` or `_`?
- Why are CMake find/config files and pkg-config files not packaged?
- Should recipes export a recipe's license?
- Why do recipes that use build tools (like CMake) that have packages in Conan Center not use them as a build require by default?
- Are python requires allowed in the `conan-center-index`?
- What version should packages use for libraries without official releases?
- Is the Jenkins orchestration library publicly available?
- Why not x86 binaries?
- Why are PDB files not allowed?
- Can I remove an option from a recipe?
- Can I split a project into an installer and library package?
- What license should I use for Public Domain?
- Why is a `tools.check_min_cppstd` call not enough?
- What is the policy for adding older versions of a package?
- What is the policy for removing older versions of a package?
- Can I install packages from the system package manager?
- Why does ConanCenter not build and execute tests in recipes?
- What is the policy for supported python versions?
- How to package libraries that depend on proprietary closed-source libraries?
- How to protect my project from breaking changes in recipes?
- Why are version ranges not allowed?
- How to consume a graph of shared libraries?
## What is the policy on recipe name collisions?

Packages generated by the build service and uploaded to Conan Center follow the `<name>/<version>` structure for the reference. Although the ecosystem of C/C++ open-source libraries is not as big as in other languages, there is still a risk of name collisions between packages.
This repository will try to follow the most well-known name for each contributed recipe, paying attention to all the contributions and checking for collisions with other popular libraries beforehand. If disambiguation is needed (due to different libraries sharing the same name), we would look at other sources and look for a consensus.
However, if that is not possible and a new recipe produces a name collision, the first recipe contributed takes precedence. Generally, recipes contributed to the repo won't change their names, in order not to break users.
For example, `GSL` is the name of the Guidelines Support Library from Microsoft and of the GNU Scientific Library from GNU. Both libraries are commonly known as `gsl`; however, to disambiguate (if there is already a `gsl` package in this repo) we could use `ms-gsl` in the first case or `gnu-gsl` in the second.
## What is the policy on creating packages from pre-compiled binaries?

The policy is that in the general case recipes should build packages from sources, because of reproducibility and security concerns. The implication is that the sources must be publicly available, and in a format that can be consumed programmatically.
Check the link for further details.
## Should reference names use `-` or `_`?

Recipes should stick to the original name of a library as much as possible. For example, `libjpeg-turbo`, `expected-lite` and `optional-lite` have a `-` in their original names.
In the case of spaces in the name, the most common approach is to use `_`, as done in `xz_utils`.
For libraries with a too generic name, like `variant`, the name of the organization can be used as a prefix separated by a `-`, like `mpark-variant`, `tl-expected` or `taocpp-tuple`.
## Why are CMake find/config files and pkg-config files not packaged?

We know that using `find_package()` and relying on CMake behavior to find dependencies is something that should be avoided, in favor of the information provided by the package manager.
Conan has an abstraction over the packages' build systems and descriptions by using generators. These generators translate the information of the dependency graph and create files suitable for consumption by your build system.
In the past, we have found that the logic of some of the CMake's find/config or pkg-config files can lead to broken scenarios due to issues with:
- Transitive dependencies: The find logic of CMake can lead to link libraries with system libraries instead of the ones specified in the conanfile.
- Different build type configurations: Usually those files are not prepared to handle multiconfiguration development while switching between release/debug build types for example.
- Absolute paths: Usually, those files include absolute paths that would make the package broken when shared and consumed.
- Hardcoded versions of dependencies as well as build options that make overriding dependencies from the consumer not possible.
We believe that the package manager should be responsible for handling this information in order to achieve deterministic and controlled behavior. Regarding the integration with CMake, Conan already provides ways to consume those packages in the same way by using generators like `cmake_find_package*` or `cmake_find_package_multi`, together with features like components, to define internal libraries of a package and generate proper CMake targets, or build modules, to package build system utilities like CMake macros.
Defining the package information in the recipe is also useful in order to consume those packages from a different build system, for example using pkg-config with the `pkg_config` generator.
Finally, by not allowing these files we make packages agnostic to the consumer as the logic of those files is not in the package but in the way the consumer wants the information.
If you really think this is an issue and there is something missing to cover the use case of a library you want to contribute to ConanCenter, please do not hesitate to open an issue and we will be happy to hear your feedback.
* Take a look at the integrations section to learn more: https://docs.conan.io/en/latest/integrations/build_system/cmake/cmake_find_package_generator.html
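To illustrate the generator approach described above, a consumer project can ask Conan to produce CMake find scripts from the package metadata instead of relying on files shipped inside the package. A minimal sketch of a `conanfile.txt` (with `zlib` as an example dependency):

```
[requires]
zlib/1.2.11

[generators]
cmake_find_package
```

After `conan install .`, a `Find<package>.cmake` module is generated per dependency, so a plain `find_package()` call in the consumer's CMakeLists.txt resolves against the package-manager-provided information.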
## Should recipes export a recipe's license?

No, recipes do not need to export a recipe license. Recipes and all files contributed to this repository are licensed under the license in the root of the repository. Using any recipe from this repository or directly from conan-center implies the same licensing.
## Why do recipes that use build tools (like CMake) that have packages in Conan Center not use them as a build require by default?
We generally consider tools like CMake as standard tools to have installed on your system. Having the `cmake` package as a build require in every recipe that uses it would be overkill, as every build requirement is installed like a requirement and takes time to download. However, `cmake` could still be useful in your profile:

```
[build_requires]
cmake/3.17.2
```
Other packages using more unusual build tools, like `OpenSSL` using `strawberryperl`, will have the build require in the recipe, as it is likely that users who want to build it from sources will not have it installed on their system.
## Are python requires allowed in the `conan-center-index`?

Unless they are a general and extended utility in recipes (in which case, we should study their inclusion in the Conan tools module), python requires are not allowed in the `conan-center-index` repository.
## What version should packages use for libraries without official releases?

The notation shown below is used for publishing packages where the original library does not make official releases. We use a format which includes the datestamp corresponding to the date of a commit: `cci.<YEAR MONTH DAY>`. In order to create reproducible builds, we also "commit-lock" to the latest commit on that day. Otherwise, users would get inconsistent results over time when rebuilding the package. An example of this is the RapidJSON library, whose package reference is `rapidjson/cci.20200410` and whose sources are locked to the latest commit on that date in config.yml. The prefix `cci.` is mandatory to distinguish it as a virtual version provided by CCI. If you are interested in the origin, please read here.
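The datestamp scheme described above can be sketched with a small helper. This is purely illustrative (`cci_version` is a hypothetical function, not part of any Conan API):

```python
from datetime import date

def cci_version(commit_date: date) -> str:
    # Hypothetical helper illustrating the cci.<YEAR MONTH DAY> scheme
    # described above; not part of any Conan tooling.
    return "cci.{:%Y%m%d}".format(commit_date)

print(cci_version(date(2020, 4, 10)))  # the RapidJSON example: cci.20200410
```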
## Is the Jenkins orchestration library publicly available?

Currently, the Jenkins orchestration library for this build service is not available. We believe this solution is too specific for this purpose, as we are massively building binaries for many configurations, and the main purpose of a CI system with Conan in an organization should be to rebuild only the needed packages. However, we know this could be interesting for organizations in order to learn new approaches for CI flows. We will release this information and CI flow recommendations as soon as possible.
## Why not x86 binaries?

As described in the Supported platforms and configurations, only the x86_64 architecture is available for download; the rest must be built from sources. The reasons behind this decision are:

- Few users need pre-built packages that are not x86_64; this number is less than 10% of total users (data obtained through the download counter from Bintray), and tends to decrease over the years;
- Some operating systems are treating x86 as obsolete, for example macOS and Ubuntu 20.04;
- For security reasons, most companies build their own packages from sources, even if a pre-built version is already available, which further reduces the need for extra configurations;
- Each recipe results in around 130 packages, and this is only for x86_64. Not all packages are used; some settings see zero downloads throughout their life. Adding more settings that would rarely be used, but that consume more resources such as time and storage, would leave us in an impractical situation.
As stated earlier, any increase in the number of configurations will result in an impractical scenario. In addition, more validations require more review time for a recipe, which would increase the time for all PRs, delaying the release of a new package. For these reasons, x86 is not validated by the CCI.
We often receive new fixes and improvements to the recipes already available for x86_64, including help for other architectures like x86 and ARM. We also receive new bug reports for recipes that do not work on a certain platform but are needed there, which helps us understand where we should put more effort. So we believe that the best way to maintain and add support for other architectures is through the community.
## Why are PDB files not allowed?

The project initially decided not to support PDB files primarily due to the size of the final package, which could grow considerably for files that many users never use. In addition, PDB files need the source code to perform debugging, and they follow the path where they were created, not the one used by the consumer, which makes them harder to use when compared to the regular development flow with the IDE.
However, there are ways to get around this. One of them is the /Z7 compilation flag, which can be passed through environment variables. You can use your profile to customize your compiler command line.
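For instance, since MSVC picks up extra compiler options from the `CL` environment variable, a profile can inject `/Z7` so that debug information is embedded in the object files themselves. A sketch of such a profile section, not an official recommendation:

```
[env]
CL=/Z7
```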
Adding one more common option may seem like the simplest and most obvious solution, but it has a side effect already seen with fPIC: it needs to be managed in the entire recipe and becomes boilerplate. So, adding PDB would be one more point to be reviewed for each recipe. In addition, in the future new options could arise, such as sanitizer or benchmark options, further inflating the recipes. For this reason, a new option will not be added. However, the inclusion of PDB files is discussed in issue #1982 and there are some ideas for making this possible through a new feature. If you want to comment on the subject, please visit the issue.
## Can I remove an option from a recipe?

It's preferable to leave all options (i.e. not remove them), because removing them may break other packages which require those deleted options. Prefer a deprecation path with a mapping from old options to new ones:

- Add "deprecated" as an option value
- Set "deprecated" as the default option value
- Check the option value; if the value is different from "deprecated", raise a warning
- Remove the option from the package ID
```python
options = {"foobar": [True, False, "deprecated"]}
default_options = {"foobar": "deprecated"}

def configure(self):
    if self.options.foobar != "deprecated":
        self.output.warn("foobar option is deprecated, do not use anymore.")

def package_id(self):
    del self.info.options.foobar
```
This is the safest way: users will be warned of the deprecation and their projects will not risk breaking. As additional examples, take a look at the following recipes: dcmtk, gtsam and libcurl.
However, if the logic is too complex (this is subjective and depends on the Conan review team), then just remove the option. After one month, we will welcome a PR removing the option that was deprecated.
## Can I split a project into an installer and library package?

No. Some projects provide not only libraries but also applications. For those projects, both libraries and executables should be kept together under the same Conan package. In the past, we tried to separate popular projects, like Protobuf, and it proved to be a complex task that was hard to maintain, requiring custom patches to disable parts of the build. Also, with the context feature, we can use the same package as a build requirement, for the build platform, and as a regular requirement, for the host platform, when cross-building. It's recommended to use two profiles in that case, one for the build platform (where the compilation tools are executed) and one for the host platform (where the generated binaries will run).
## What license should I use for Public Domain?

Public Domain is not a license by itself. Thus, we have equivalent licenses to be used instead. By default, if a project uses Public Domain and there is no official license listed, you should use Unlicense.
## Why is a `tools.check_min_cppstd` call not enough?

Very often C++ projects require a minimum standard version, such as 14 or 17, in order to compile. Conan offers tools to check that the relevant setting is present and meets the minimum required version; otherwise, the compiler's default is used.
```python
def configure(self):
    tools.check_min_cppstd(self, 14)  # 👈 Wrong!
```
This fails to cover the vast number of use cases for the following reasons:

- `cppstd` is not configured in the `--detect`ed profiles generated by Conan, so the majority of users simply do not have this setting.
- A shocking number of projects override this setting within their respective build scripts, and the setting does not get applied in those cases.
- Conan-Center-Index does not manage the `cppstd` setting for the compilers it supports to generate binaries.
```python
def validate(self):
    # 👇 Correct
    if self.settings.compiler.get_safe("cppstd"):
        tools.check_min_cppstd(self, 14)
```
As a result, all calls to `tools.check_min_cppstd` must be guarded by a check for the setting, and the only way to ensure the C++ standard is to check the compiler's version to know if it offers sufficient support. An example of this can be found here.
## What is the policy for adding older versions of a package?

We defer adding older versions without a direct requirement, so we would love to hear why you need one in the opening description of the PR. This is for historical reasons: when older versions were permitted, the overwhelming majority received zero downloads and were never used by the community, while still increasing the burden on the build system.
## What is the policy for removing older versions of a package?

Keeping many older versions can be a problem, as over time they may become incompatible with newer versions of the package's Python code and/or dependencies. They are also downloaded less often than newer versions, yet continue to consume CI resources during pull requests.
Given technical limitations and/or incompatibilities emerging from infrastructure changes, removing older versions from `config.yml` and `conandata.yml` may be permitted. The respective recipes and binary packages will not be removed from Conan Center, but they will not receive new updates, as they are no longer listed to be built.
There is no strict rule for keeping older versions, but we recommend keeping only the latest version of each old major release. For the latest major version available, the last patch version of each minor version should be available. As an example, see the CMake package.
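The retention recommendation above can be sketched programmatically. This is a hypothetical helper (not part of any Conan tooling) that assumes plain numeric `X.Y.Z` version strings:

```python
def versions_to_keep(versions):
    # Hypothetical sketch of the retention policy described above:
    # keep the newest version of each old major release and, for the
    # newest major, the last patch of every minor.
    # Assumes plain "X.Y.Z" numeric version strings.
    parsed = sorted(tuple(map(int, v.split("."))) for v in versions)
    newest_major = parsed[-1][0]
    keep = {}
    for major, minor, patch in parsed:
        key = major if major != newest_major else (major, minor)
        keep[key] = (major, minor, patch)  # later (higher) versions win
    return sorted(".".join(map(str, v)) for v in keep.values())

print(versions_to_keep(["2.6.4", "2.8.12", "3.16.9", "3.17.2", "3.17.3"]))
# ['2.8.12', '3.16.9', '3.17.3']
```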
## Can I install packages from the system package manager?

It depends. You cannot mix regular projects with system packages, but you can provide package wrappers for system packages. However, Conan cannot track system packages, like their versions and options, which creates a fragile situation: it affects the libraries and binaries built in your package and cannot be fully reproduced. Also, system package managers require administrator permission to install packages, which is not always possible and may break limited users. Moreover, more than one Conan package may require the same system package, and there is no way to track their mutual usage.
The hook KB-H032 does not allow `system_requirements` nor `SystemPackageTool` in recipes, to avoid mixing regular projects with system packages in the same recipe.
There are exceptions where some projects are closer to system drivers or hardware, and packaging them as a regular library could result in an incompatible Conan package. To deal with those cases, you are allowed to provide an exclusive Conan package which only installs system packages; see the How-to for more.
## Why does ConanCenter not build and execute tests in recipes?

There are different motivations:

- Time and resources: adding the build time required by the test suite plus its execution time can increase our build times significantly across the 100+ configurations.
- ConanCenter is a service that builds binaries for the community for existing library versions; it is not an integration system to test the libraries.
## What is the policy for supported python versions?

Python 2.7 and earlier is not supported by ConanCenter, as it is already EOL.
Python 3.5 and earlier is also not supported by ConanCenter, as it is already EOL.
Python 3.6 onwards is currently supported by the infrastructure and the recipes.
Our docker images use Python 3.7.5+ at the moment. Windows agents currently use Python 3.6.7+, and macOS agents use Python 3.7.3+.
The version run by our agents and docker images is subject to change as security updates to Python are released or versions enter EOL.
Additional concerns about supported versions within the Conan ecosystem (not just ConanCenter, but the client itself and other tools) are documented in the tribe.
For ConanCenter, besides security, there are various concerns about critical features provided by the Python interpreter, including its syntax and the standard library, e.g.:
- LZMA compression support
- Unicode awareness
- long-path awareness
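Capabilities like the LZMA support mentioned above depend on how the interpreter itself was built and can be probed at runtime. A minimal sketch:

```python
import importlib.util

# Probe whether this CPython build was compiled against liblzma; the
# lzma module is in the standard library since 3.3, but only when the
# interpreter was built with the library available.
has_lzma = importlib.util.find_spec("lzma") is not None
print("LZMA support:", has_lzma)
```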
Right now, only the CPython flavor of the interpreter is supported (e.g. we have never tested whether recipes work with IronPython, Jython, Cython, etc.).
In addition, we support only 64-bit builds of the interpreter (amd64/x86_64 architecture); 32-bit builds (x86) are not supported and not installed on the agents.
There are no guarantees that recipes will work correctly with future Python versions that introduce breaking changes to the interpreter, as we don't test all the possible combinations (and probably never will). Patches are welcome if problems are found.
## How to package libraries that depend on proprietary closed-source libraries?

There are several popular software libraries provided by Intel:
- Intel Math Kernel Library (MKL)
- Intel Integrated Performance Primitives (IPP)
- Intel Deep Neural Networking Library (DNN)
These Intel libraries are widely used by various well-known open-source projects (e.g. OpenCV or TensorFlow).
Unfortunately, these Intel libraries cannot be accepted into ConanCenter for several important reasons:

- They are closed-source and commercial products; ConanCenter cannot redistribute their binaries due to the license restrictions.
- Registration on the Intel portal is required in order to download the libraries; there are no permanent public direct download links.
- They use graphical installers which are hard to automate within a conan recipe.
Instead, libraries that depend on MKL, IPP or DNN should use the following references:

- `intel-mkl/<version>`, e.g. `intel-mkl/2021`
- `intel-ipp/<version>`, e.g. `intel-ipp/2021`
- `intel-dnn/<version>`, e.g. `intel-dnn/2021`
NOTE: These references are not available in ConanCenter and likely never will be! It is the consumer's responsibility to provide the recipes for these libraries.
Since these references will never be available in ConanCenter, they will be deactivated in the consuming recipes by default:
```python
options = {
    "shared": [True, False],
    "fPIC": [True, False],
    "with_intel_mkl": [True, False],
}
default_options = {
    "shared": False,
    "fPIC": True,
    "with_intel_mkl": False,
}

def requirements(self):
    if self.options.with_intel_mkl:
        self.requires("intel-mkl/2021")
```
If consumers activate the option explicitly (`with_intel_mkl=True`), Conan will fail because of the unknown reference.
Consumers may use an override facility in order to use their own private references for the Intel MKL, IPP or DNN libraries.
For instance, if you have a private reference `intel-mkl/2021@mycompany/stable`, then you may use the following override in your `conanfile.txt`:
```
[requires]
intel-mkl/2021@mycompany/stable
```
## How to protect my project from breaking changes in recipes?

This repository and the CI building its recipes are continuously moving to new Conan versions, sometimes adopting new features as soon as they are released (Conan client changelog).
You should expect that the latest revision of a recipe can introduce breaking changes and use new features that will be broken unless you also upgrade your Conan client (and sometimes you will need to modify your project if the recipe changes the binaries, flags, etc. it provides).
To isolate yourself from these changes, there are different strategies you can follow:
The minimum solution involves small changes to your Conan client configuration:

- Pin the version of every reference you consume in your project using either:
  - recipe revisions (RREV): `foo/1.0@#RREV` instead of `foo/1.0` in your conanfile.
  - lockfiles.
For larger projects and teams, it is recommended to add some infrastructure to ensure stability:

- Cache recipes in your own Artifactory: your project should use only this remote, and new recipe revisions are only pushed to your Artifactory after they have been validated in your project.
Keep reading in the consuming recipes section.
## Why are version ranges not allowed?

Version ranges are a useful Conan feature; find the documentation here. However, in the context of ConanCenter they pose a few key challenges, most notably:
- Non-Deterministic `package-id`: With version ranges, the newest compatible version may yield a different `package-id` than the one built and published by ConanCenter, resulting in the frustrating "no binaries found" error. For more context, see this excellent explanation.
- Build Reproducibility: If consumers try to download and build the recipe at a later time, it may resolve to a different package version that may generate a different binary (which may or may not be compatible). In order to prevent these types of issues, we have decided to only allow exact requirement versions. This is a complicated issue; check this thread for more.
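To illustrate the policy, a recipe in this repository must pin exact versions in its requirements. A sketch using hypothetical reference names:

```python
# Illustrative recipe attributes (hypothetical reference name and version).

# Allowed in conan-center-index: an exact, reproducible requirement.
requires = "zlib/1.2.11"

# Not allowed: a version range, which may resolve differently over time.
# requires = "zlib/[>=1.2.11]"

print(requires)  # zlib/1.2.11
```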
## How to consume a graph of shared libraries?

When the CI builds packages with `shared=True`, it applies the option only to the package being created, not to its requirements. As the default value for the `shared` option is usually `False`, you can expect that the dynamic library that has just been generated linked all of its requirements as static libraries.
It is important to note the default package ID mode used by Conan (which is the same default used by ConanCenter): `semver_direct_mode`. With this default, only the `major` version of the requirements is encoded in the package ID.
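A rough illustration of that reduction (this is not Conan's actual implementation, just a sketch of the idea, assuming a plain `X.Y.Z` version):

```python
def semver_direct_mode(version: str) -> str:
    # Rough illustration of how the default semver_direct_mode reduces a
    # direct requirement's version inside the package ID: only the major
    # component is significant, so 1.2.11 and 1.3.0 produce the same ID.
    major = version.split(".")[0]
    return "{}.Y.Z".format(major)

print(semver_direct_mode("1.2.11"))  # 1.Y.Z
```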
The two previous behaviors together can lead to unexpected results for a user who wants to consume a graph of dependencies as shared libraries from ConanCenter. They might think that using `*:shared=True` in their profile is enough, and indeed Conan will retrieve from ConanCenter all the dynamic libraries for the whole graph of dependencies, but each of them will contain the logic of its respective requirements embedded in the dynamic library. This logic is embedded at build time, so it might not match the version of the requirements that was resolved by Conan, and the other dynamic libraries won't be used: only the ones linked directly by the consumer project. See a more detailed example here.
In order to consume all those libraries as shared ones, building from sources is needed. This can be easily achieved by using `*:shared=True` in the host profile and `--build` in the install command. With these inputs, Conan will build all the packages from sources and use the shared libraries when linking.
ℹ️ Note: If you are hosting your own recipes, the proper solution would be to use something like `shared_library_package_id`, which encodes this information in the package ID and ensures that any change in the static libraries that are embedded into a shared one is taken into account when computing the package ID. In this repository we are not using it, because it would lead to many missing packages, making it impossible for the CI to actually build consumers in PRs.