Commit

Built site for gh-pages
Quarto GHA Workflow Runner committed Oct 2, 2023
1 parent c61e2e4 commit 7fd7d8e
Showing 4 changed files with 10 additions and 19 deletions.
2 changes: 1 addition & 1 deletion .nojekyll
Original file line number Diff line number Diff line change
@@ -1 +1 @@
384428fe
646e57b5
15 changes: 3 additions & 12 deletions goals.html
@@ -6,7 +6,7 @@

<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes">

<meta name="description" content="The FAIR Universe project is made up of a diverse set of researchers across high energy physics, cosmology, and machine learning.">
<meta name="description" content="We are building an open, large-compute-scale AI ecosystem for sharing datasets, training large models, fine-tuning those models, and hosting challenges and benchmarks. Hosting of all these datasets and benchmarks will be achieved by interfacing the NERSC HPC center to Codabench, a recently developed open-source platform, to provide a next-generation reproducible-science AI ecosystem.">

<title>FAIR Universe - Goals</title>
<style>
@@ -118,13 +118,7 @@
<div id="quarto-content" class="quarto-container page-columns page-rows-contents page-layout-article page-navbar">
<!-- sidebar -->
<nav id="quarto-sidebar" class="sidebar collapse collapse-horizontal sidebar-navigation floating overflow-auto">
<nav id="TOC" role="doc-toc" class="toc-active">
<h2 id="toc-title">Contents</h2>

<ul>
<li><a href="#goals" id="toc-goals" class="nav-link active" data-scroll-target="#goals">Goals</a></li>
</ul>
</nav>

</nav>
<div id="quarto-sidebar-glass" data-bs-toggle="collapse" data-bs-target="#quarto-sidebar,#quarto-sidebar-glass"></div>
<!-- margin-sidebar -->
@@ -140,7 +134,7 @@ <h1 class="title">Goals</h1>

<div>
<div class="description">
<p>The FAIR Universe project is made up of a diverse set of researchers across high energy physics, cosmology, and machine learning.</p>
<p>We are building an open, large-compute-scale AI ecosystem for sharing datasets, training large models, fine-tuning those models, and hosting challenges and benchmarks. Hosting of all these datasets and benchmarks will be achieved by interfacing the NERSC HPC center to Codabench, a recently developed open-source platform, to provide a next-generation reproducible-science AI ecosystem.</p>
</div>
</div>

@@ -155,8 +149,6 @@ <h1 class="title">Goals</h1>

</header>

<section id="goals" class="level1">
<h1>Goals</h1>
<p>Tackling the next generation of AI applications for high energy physics (HEP), in particular those that are uncertainty-aware, requires the creation of an ecosystem that can enable community access to datasets, benchmarks and existing algorithms backed by large-scale compute. This project will build the essential pieces of such an ecosystem through deployment of:</p>
<ol type="1">
<li>Three HEP systematic uncertainty datasets and tasks, of increasing sophistication, tailored for studies of systematic-uncertainty aware AI techniques, in particle physics and cosmology.</li>
@@ -166,7 +158,6 @@
<p>The collaboration with Codabench and NERSC will ensure that the project platform, benchmarks and a portfolio of algorithms will be curated and made accessible, and therefore continue to benefit the HEP community, as well as other sciences and the machine learning research community well beyond the end of the project. The research community will benefit from being exposed to well-established, empirical UQ approaches for estimation that experimenters have deployed on problems with hundreds of systematic effects. The development of principled methodologies to quantify the impact of systematic effects in the training and inference of ML models will increase the trust of the scientific community in AI methods applied to experimental high-energy physics and beyond. The progressive structure of our challenges will bring together activity across particle physics and cosmology. Finally, both the methods and platform developed in this project will serve as a foundation for future AI challenges and benchmarks in high-energy physics as well as other scientific and industrial applications.</p>


</section>

</main> <!-- /main -->
<script id="quarto-html-after-body" type="application/javascript">
2 changes: 1 addition & 1 deletion search.json
@@ -25,7 +25,7 @@
"href": "goals.html",
"title": "Goals",
"section": "",
"text": "Goals\nTackling the next generation of AI applications for high energy physics (HEP), in particular those that are uncertainty-aware, requires the creation of an ecosystem that can enable community access to datasets, benchmarks and existing algorithms backed by large-scale compute. This project will build the essential pieces of such an ecosystem through deployment of:\n\nThree HEP systematic uncertainty datasets and tasks, of increasing sophistication, tailored for studies of systematic-uncertainty aware AI techniques, in particle physics and cosmology.\nA set of HEP-AI challenges and long-lived task and algorithm benchmarks addressing compelling questions about the impact of systematic effects in AI models.\nAn HPC-enabled AI benchmark platform capable of hosting datasets and models; producing new simulated datasets; applying new AI algorithms on existing datasets; and applying uploaded AI algorithms on new datasets.\n\nThe collaboration with Codabench and NERSC will ensure that the project platform, benchmarks and a portfolio of algorithms will be curated and made accessible, and therefore continue to benefit the HEP community, as well as other sciences and the machine learning research community well beyond the end of the project. The research community will benefit from being exposed to well-established, empirical UQ approaches for estimation that experimenters have deployed on problems with hundreds of systematic effects. The development of principled methodologies to quantify the impact of systematic effects in the training and inference of ML models will increase the trust of the scientific community in AI methods applied to experimental high-energy physics and beyond. The progressive structure of our challenges will bring together activity across particle physics and cosmology. Finally, both the methods and platform developed in this project will serve as a foundation for future AI challenges and benchmarks in high-energy physics as well as other scientific and industrial applications."
"text": "Tackling the next generation of AI applications for high energy physics (HEP), in particular those that are uncertainty-aware, requires the creation of an ecosystem that can enable community access to datasets, benchmarks and existing algorithms backed by large-scale compute. This project will build the essential pieces of such an ecosystem through deployment of:\n\nThree HEP systematic uncertainty datasets and tasks, of increasing sophistication, tailored for studies of systematic-uncertainty aware AI techniques, in particle physics and cosmology.\nA set of HEP-AI challenges and long-lived task and algorithm benchmarks addressing compelling questions about the impact of systematic effects in AI models.\nAn HPC-enabled AI benchmark platform capable of hosting datasets and models; producing new simulated datasets; applying new AI algorithms on existing datasets; and applying uploaded AI algorithms on new datasets.\n\nThe collaboration with Codabench and NERSC will ensure that the project platform, benchmarks and a portfolio of algorithms will be curated and made accessible, and therefore continue to benefit the HEP community, as well as other sciences and the machine learning research community well beyond the end of the project. The research community will benefit from being exposed to well-established, empirical UQ approaches for estimation that experimenters have deployed on problems with hundreds of systematic effects. The development of principled methodologies to quantify the impact of systematic effects in the training and inference of ML models will increase the trust of the scientific community in AI methods applied to experimental high-energy physics and beyond. The progressive structure of our challenges will bring together activity across particle physics and cosmology. Finally, both the methods and platform developed in this project will serve as a foundation for future AI challenges and benchmarks in high-energy physics as well as other scientific and industrial applications."
},
{
"objectID": "index.html",
10 changes: 5 additions & 5 deletions sitemap.xml
@@ -2,22 +2,22 @@
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://fair-universe.github.io/about.html</loc>
<lastmod>2023-10-02T20:03:59.652Z</lastmod>
<lastmod>2023-10-02T20:07:57.687Z</lastmod>
</url>
<url>
<loc>https://fair-universe.github.io/events.html</loc>
<lastmod>2023-10-02T20:03:59.028Z</lastmod>
<lastmod>2023-10-02T20:07:57.055Z</lastmod>
</url>
<url>
<loc>https://fair-universe.github.io/goals.html</loc>
<lastmod>2023-10-02T20:03:58.076Z</lastmod>
<lastmod>2023-10-02T20:07:56.083Z</lastmod>
</url>
<url>
<loc>https://fair-universe.github.io/team.html</loc>
<lastmod>2023-10-02T20:03:58.724Z</lastmod>
<lastmod>2023-10-02T20:07:56.747Z</lastmod>
</url>
<url>
<loc>https://fair-universe.github.io/index.html</loc>
<lastmod>2023-10-02T20:03:59.352Z</lastmod>
<lastmod>2023-10-02T20:07:57.383Z</lastmod>
</url>
</urlset>
