From 03b44be7d1102b6290446c096dc1a93f48775a24 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Wed, 10 Jul 2024 16:50:37 -0500 Subject: [PATCH 01/47] Update privacy-policy.md Removed a broken link. Changed outbound link to open in a new tab. --- pages/privacy-policy.md | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/pages/privacy-policy.md b/pages/privacy-policy.md index 4ce2512b9..eac213e7e 100644 --- a/pages/privacy-policy.md +++ b/pages/privacy-policy.md @@ -23,7 +23,7 @@ There are two kinds of cookies: Distinguish unique users (last for up to 2 years if you never clear your cookies) Throttle the request rate (last for up to 1 minute) -If you do not wish to accept cookies, you can edit your browser's options to stop accepting persistent cookies or to prompt you before accepting a cookie from the websites you visit. See additional information [on disabling cookies and/or Google demographic and interests reports](https://www.usa.gov/optout-instructions). +If you do not wish to accept cookies, you can edit your browser's options to stop accepting persistent cookies or to prompt you before accepting a cookie from the websites you visit. See additional information [on disabling cookies](https://www.usa.gov/optout-instructions){:target="_blank"}. Note: Although using persistent cookies allows us to deliver a better experience for you, this site will also work without them. @@ -31,7 +31,7 @@ Note: Although using persistent cookies allows us to deliver a better experience Users are NOT required to provide any information to search, retrieve, download, filter and otherwise use the data available on Challenge.Gov. If you choose to provide us with personal information—like sending an email to Challenge.Gov to ask questions—we use that information to respond to your message, and to help get you the information you requested. We only share the information you give us with another government agency to assist in answering your questions and to better understand user needs for Challenge.Gov, or as otherwise required by law. Any email address provided in connection with your question or suggestion will not be publicly viewable on the website. Challenge.Gov never collects information or creates individual profiles for commercial marketing. -In contacting Challenge.Gov with your questions and comments, you should NOT include additional personal information, especially [Social Security numbers](https://www.ssa.gov/pubs/EN-05-10002.pdf). Challenge.Gov is NOT a Privacy Act System of Record. +In contacting Challenge.Gov with your questions and comments, you should NOT include additional personal information, especially Social Security numbers. Challenge.Gov is NOT a Privacy Act System of Record. ## Security @@ -42,9 +42,9 @@ Unauthorized attempts to upload information or change information on GSA servers While Challenge.Gov uses social media including Facebook, Twitter, and YouTube, no personally identifiable information (PII) is sought or provided to GSA as a result of our use of these platforms. -The pages on Challenge.Gov may include hypertext links or pointers to information created and maintained by other public and private organizations. Check the [linking policy](https://www.gsa.gov/website-information/linking-policy) for more information. +The pages on Challenge.Gov may include hypertext links or pointers to information created and maintained by other public and private organizations. 
For more information on privacy and security: -* See [GSA's Privacy and Security Policy](https://www.gsa.gov/reference/gsa-privacy-program). -* [Contact us](mailto:team@challenge.gov) with questions. +* See [GSA's Privacy and Security Policy](https://www.gsa.gov/reference/gsa-privacy-program){:target="_blank"}. +* [Contact us](https://www.challenge.gov/contact/) with questions. From 49fb628e40be921ee9270ffc153b11d15cde49f4 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Thu, 11 Jul 2024 07:52:47 -0500 Subject: [PATCH 02/47] Update federal-agency-faqs.html Updated links throughout --- pages/federal-agency-faqs.html | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/pages/federal-agency-faqs.html b/pages/federal-agency-faqs.html index 88badefcd..9bed17b51 100644 --- a/pages/federal-agency-faqs.html +++ b/pages/federal-agency-faqs.html @@ -55,7 +55,7 @@
Note: Challenge.Gov highly recommends you engage your agency’s legal counsel for advice regarding the America COMPETES Act and prize competitions.
@@ -75,7 +75,7 @@To stay up-to-date on upcoming events and resources available to support your work, click here to subscribe to Federal Challenge Monthly, the Challenge.Gov community e-Newsletter.
+To stay up to date on upcoming events and resources available to support your work, subscribe to the Challenge.Gov community e-Newsletter.
Once your account has been created, a member of the Challenge.Gov Support Team will review your account and grant you access to the Challenge Manager portal. Please allow up to two (2) business days for your request to be processed.
-Note: Challenge.Gov uses Login.gov to manage user accounts and system access. Visit Login.gov
+ Note: Challenge.Gov uses Login.gov to manage user accounts and system access. Visit Login.gov
for more information. Click the “Login” button in the upper right corner of Challenge.Gov to get started. See the
step-by-step instructions below for details. Note: Challenge.Gov uses Login.gov to manage user accounts and system access. Visit Login.gov
+ Note: Challenge.Gov uses Login.gov to manage user accounts and system access. Visit Login.gov
for more information. The America COMPETES Reauthorization Act of 2010 (COMPETES) provides prize authority to the head of each federal agency. In 2017, COMPETES was amended with the passage of the American Innovation and Competitiveness Act of 2017.
+ The America COMPETES Reauthorization Act of 2010 (COMPETES) provides prize authority to the head of each federal agency. In 2017, COMPETES was amended with the passage of the American Innovation and Competitiveness Act of 2017.
- As described in the M-10-11 memo "Guidance on the Use of Challenges and Prizes to Promote Open Government" several considerations are applicable when determining the authority under which a prize competition may be conducted at an agency. These considerations and approaches may include direct statutory to conduct prizes, grants and cooperative agreements, procurement authority, other transaction authority, or partnership authority. As described in the M-10-11 memo "Guidance on the Use of Challenges and Prizes to Promote Open Government" several considerations are applicable when determining the authority under which a prize competition may be conducted at an agency. These considerations and approaches may include direct statutory to conduct prizes, grants and cooperative agreements, procurement authority, other transaction authority, or partnership authority. The America COMPETES Reauthorization Act of 2010 (COMPETES) and the American Innovation and Competitiveness Act (AICA) require the White House Office of Science and Technology Policy to produce a biennial report to Congress on the activities carried out under these authorities. This report highlights how government agencies use prize competitions and challenges (PC&Cs) and crowdsourcing and citizen science (CCS) activities to engage with members of the public to innovate, drive scientific discovery, and solve important problems. Reporting on related activities conducted under other authorities is voluntary and also included. If you’re a federal employee with a great idea or public solver who wants to share your expertise, the Challenge.Gov team is here to support you. Contact team@Challenge.Gov, sign up for our newsletter, or follow us on social media: Twitter @ChallengeGov, Facebook @ChallengeGov. If you’re a federal employee with a great idea or public solver who wants to share your expertise, the Challenge.Gov team is here to support you. Contact team@Challenge.Gov, sign up for our newsletter, or follow us on Facebook @ChallengeGov.
Getting Started With the Challenge.Gov platform?
General Questions About Prize Competitions
Do all agencies have legal authority to run a prize competition?
- White House, Office of Management and Budget (OMB), and Government-Wide Guidance
-
White House: Prizes and Challenge Initiatives Addressing COVID-19
-
Sample Agency Policies and Capacity-Building Resources
U.S. Department of Health and Human Services (HHS)
-
U.S. Department of the Interior (DOI)
-
National Aeronautics and Space Administration (NASA)
-
National Institute of Standards and Technology (NIST)
-
U.S. Agency for International Development (USAID)
-
U.S. Department of Agriculture (USDA)
-
U.S. Department of Energy (DOE)
-
Implementation of Federal Prize Authority
-
-
Reports and Publications on Public-Sector Prizes
-
Other Blogs and Fact Sheets
Challenge.Gov
-
Challenge Marketing
Videos
-
Articles and Fact Sheets
-
Blogs
-
From 040b461db4e19f4c0233ccee0d1364f0e401de90 Mon Sep 17 00:00:00 2001
From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com>
Date: Thu, 11 Jul 2024 08:22:57 -0500
Subject: [PATCH 04/47] Update resources.md
Updated first link on the list for consistency
---
pages/toolkit/resources.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/pages/toolkit/resources.md b/pages/toolkit/resources.md
index 034e2eccf..4a78590c9 100644
--- a/pages/toolkit/resources.md
+++ b/pages/toolkit/resources.md
@@ -9,7 +9,7 @@ title: Toolkit - Resources
White House, Office of Management and Budget (OMB), and Government-Wide Guidance
-
-
-
Who are you?
-My name is Tim K. Mackey, and I am the co-founder and CEO of [S-3 Research LLC](https://www.s-3.io/). I’m essentially a researcher-turned-entrepreneur with the help of the U.S. government through the SUD Startup Challenge award and the [NIDA Small Business Innovation Research (SBIR) program](https://sbir.nih.gov/nida/index). I am also a current associate professor at UC San Diego where I teach and research on global health, health technology, and public policy. +My name is Tim K. Mackey, and I am the co-founder and CEO of [S-3 Research LLC](https://www.s-3.io/){:target="_blank"}. I’m essentially a researcher-turned-entrepreneur with the help of the U.S. government through the SUD Startup Challenge award and the [Small Business Innovation Research (SBIR) program](https://nida.nih.gov/funding/small-business-innovation-research-sbir-technology-transfer-sttr-programs){:target="_blank"}. I am also a current associate professor at UC San Diego where I teach and research on global health, health technology, and public policy. **What is the name of your company, where is it located, and what does it “do”?** @@ -50,11 +50,11 @@ My background is primarily academic, though I also worked over a decade in the p **What inspired you to focus on Substance Use Disorders?** -The scourge of the opioid epidemic and its toll on society is real and acute. I’ve been studying illegal sales of drugs online my whole career, and when the opioid epidemic hit, we saw a flood of diversion go online. I was inspired first as a Master’s student at UC San Diego when I went to a graduate seminar where a woman named Francis Haight talked about the tragedy of losing her teenage son, Ryan Haight, when he purchased Vicodin online and overdosed. Subsequently, [Federal legislation was passed in his name](https://www.congress.gov/110/plaws/publ425/PLAW-110publ425.pdf) to make it illegal to sell controlled substances online. From that point on, I decided that fighting illegal online sales of drugs would be one of my core goals as a researcher. +The scourge of the opioid epidemic and its toll on society is real and acute. I’ve been studying illegal sales of drugs online my whole career, and when the opioid epidemic hit, we saw a flood of diversion go online. I was inspired first as a Master’s student at UC San Diego when I went to a graduate seminar where a woman named Francis Haight talked about the tragedy of losing her teenage son, Ryan Haight, when he purchased Vicodin online and overdosed. Subsequently, [Federal legislation "Ryan Haight Online Pharmacy Consumer Protection Act of 2008"](https://www.congress.gov/110/plaws/publ425/PLAW-110publ425.pdf){:target="_blank"} was passed to make it illegal to sell controlled substances online. From that point on, I decided that fighting illegal online sales of drugs would be one of my core goals as a researcher. **How did you hear about NIDA’s SUD Startup Challenge and why did you decide to apply?** -We heard about the challenge after participating in the HHS 2017 Opioid Code-a-Thon after being invited to form a team by a colleague formally at the U.S. Centers for Disease Control and Prevention. We were chosen as a finalist but didn’t win one of the three prizes, however, this gave us the opportunity to learn about the Challenge award and we applied.
+We heard about the challenge after participating in the HHS 2017 Opioid Code-a-Thon after being invited to form a team by a colleague formerly at the U.S. Centers for Disease Control and Prevention. We were chosen as a finalist but didn’t win one of the three prizes; however, this gave us the opportunity to learn about the Challenge award and we applied.
@@ -78,14 +78,14 @@ Dr. Koustova really encouraged us to put in the hard work of following a format **Can you share any stories of how your winning solution has affected someone’s life for the better?** -Our solution protects consumers, but we rarely get to see the fruits of our labor as we pass on our surveillance results to law enforcement and regulators to take action. That said, we are doing quite a bit of work now on [fake and counterfeit COVID-19 products](https://www.consumer.ftc.gov/blog/2020/04/covid-19-scam-reports-numbers) being sold online. We know that our efforts will help address one part of this pandemic, and we hope our solution ensures that the public is not harmed or scammed at this critical time for all of us. +Our solution protects consumers, but we rarely get to see the fruits of our labor as we pass on our surveillance results to law enforcement and regulators to take action. That said, we are doing quite a bit of work now on fake and counterfeit COVID-19 products being sold online,(["COVID-19 scam reports, by the numbers"](https://www.consumer.ftc.gov/blog/2020/04/covid-19-scam-reports-numbers){:target="_blank"}). We know that our efforts will help address one part of this pandemic, and we hope our solution ensures that the public is not harmed or scammed at this critical time for all of us.I just want to express my thanks to all the people at the NIH and specifically the NIDA SBIR program. They are the reason why our innovation is being translated into something that will directly impact people’s lives, and hopefully, help bring an end to this terrible opioid epidemic.
\ No newline at end of file +I just want to express my thanks to all the people at the NIH and specifically the NIDA SBIR program. They are the reason why our innovation is being translated into something that will directly impact people’s lives, and hopefully, help bring an end to this terrible opioid epidemic.
From a27a43c00e5f9deaf8e41d5f58a95b86cc0a8cf8 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Thu, 11 Jul 2024 14:03:22 -0500 Subject: [PATCH 08/47] Update 2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md Typo fix --- ...2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/_posts/2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md b/_posts/2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md index 116b4e0ad..da486d710 100644 --- a/_posts/2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md +++ b/_posts/2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md @@ -78,7 +78,7 @@ Dr. Koustova really encouraged us to put in the hard work of following a format **Can you share any stories of how your winning solution has affected someone’s life for the better?** -Our solution protects consumers, but we rarely get to see the fruits of our labor as we pass on our surveillance results to law enforcement and regulators to take action. That said, we are doing quite a bit of work now on fake and counterfeit COVID-19 products being sold online,(["COVID-19 scam reports, by the numbers"](https://www.consumer.ftc.gov/blog/2020/04/covid-19-scam-reports-numbers){:target="_blank"}). We know that our efforts will help address one part of this pandemic, and we hope our solution ensures that the public is not harmed or scammed at this critical time for all of us. +Our solution protects consumers, but we rarely get to see the fruits of our labor as we pass on our surveillance results to law enforcement and regulators to take action. That said, we are doing quite a bit of work now on fake and counterfeit COVID-19 products being sold online, (["COVID-19 scam reports, by the numbers"](https://www.consumer.ftc.gov/blog/2020/04/covid-19-scam-reports-numbers){:target="_blank"}). We know that our efforts will help address one part of this pandemic, and we hope our solution ensures that the public is not harmed or scammed at this critical time for all of us.As recent events have shown, in times of great crisis, federal agencies can tap into the country’s vast network of innovators, makers, and citizen-solvers to meet—and overcome—some of the most complex and unanticipated challenges of the day through rapidly scaled prize competitions. As the web platform that connects federal agencies with the public to crowdsource solutions to problems both big and small, Challenge.gov is proud to play an important role in the ongoing fight against this deadly disease.
\ No newline at end of file +As recent events have shown, in times of great crisis, federal agencies can tap into the country’s vast network of innovators, makers, and citizen-solvers to meet—and overcome—some of the most complex and unanticipated challenges of the day through rapidly scaled prize competitions. As the web platform that connects federal agencies with the public to crowdsource solutions to problems both big and small, Challenge.gov is proud to play an important role in the ongoing fight against this deadly disease.
From f734b23ab5fd17727fe017cbe1c376c68ff46dc0 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Thu, 11 Jul 2024 14:36:54 -0500 Subject: [PATCH 10/47] Update 2020-06-02-tracking-a-virus.md updated links throughout --- _posts/2020-06-02-tracking-a-virus.md | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/_posts/2020-06-02-tracking-a-virus.md b/_posts/2020-06-02-tracking-a-virus.md index 8dfdc53f8..ddf6f57e0 100644 --- a/_posts/2020-06-02-tracking-a-virus.md +++ b/_posts/2020-06-02-tracking-a-virus.md @@ -17,11 +17,11 @@ Shaman has spent nearly two decades studying the effects of atmospheric conditio But today, the New York-based professor is tracking another virus. -On several weekly calls facilitated by the Centers for Disease Control and Prevention (CDC), Shaman and other researchers are discussing inference, operations forecasts and projections for COVID-19, the disease at the center of a global pandemic that has resulted in more than 349,095 deaths worldwide. +On several weekly calls facilitated by the Centers for Disease Control and Prevention (CDC), Shaman and other researchers are discussing inference, operations forecasts and projections for COVID-19, the disease at the center of a global pandemic that has resulted in more than 349,095 deaths worldwide. Shaman and his team know about infectious disease projections. -In 2014, they claimed the $75,000 top prize in the CDC’s Predict the Influenza Season Challenge. The prize competition challenged participants to develop more advanced, real-time flu forecasts to inform prevention strategies and control tactics. +In 2014, they claimed the $75,000 top prize in the CDC’s Predict the Influenza Season Challenge. The prize competition challenged participants to develop more advanced, real-time flu forecasts to inform prevention strategies and control tactics. The challenge launched in 2013, and at the time, forecasting the spread of influenza had not been considered particularly feasible in any sort of operational way, explains Matthew Biggerstaff, a research epidemiologist in the CDC’s Epidemiology and Prevention Branch, Influenza Division. @@ -42,7 +42,7 @@ The challenge launched in 2013, and at the time, forecasting the spread of influ The competition allowed CDC to see which research groups were working in the space and to evaluate then-state-of-the-art concepts, he says. -Just four years prior, the H1N1 influenza virus surged into a global pandemic. Between April 2009 and 2010, the CDC estimates the respiratory condition claimed the lives of 12,469 Americans and between 151,700 and 575,400 people around the world. +Just four years prior, the H1N1 influenza virus surged into a global pandemic. Between April 2009 and 2010, the CDC estimates the respiratory condition claimed the lives of 12,469 Americans and between 151,700 and 575,400 people around the world. H1N1's swift spread identified a critical gap—the ability to rapidly and accurately forecast the spread of illnesses and pandemics. @@ -56,7 +56,7 @@ Today, public health officials not only are familiar with the science, mathemati Shaman’s participation provided funds to support students and personnel in his laboratory, and helped his team gain visibility for follow-on research funding from agencies including the National Institutes of Health and the Department of Defense. 
-In fact, the CDC challenge opened the door for other competitions that followed, including the Defense Advanced Projects Research Agency (DARPA) CHIKV Challenge, a $150,000 competition which sought models for the spread of the chikungunya virus. +In fact, the CDC challenge opened the door for other competitions that followed, including the Defense Advanced Projects Research Agency (DARPA) CHIKV Challenge, a $150,000 competition which sought models for the spread of the chikungunya virus. Prize-winning University of Arizona professors Joceline Lega and Heidi Brown won that challenge for developing a mathematical model that forecasts the chikungunya infection case counts as the disease emerged. @@ -82,4 +82,4 @@ The number of rapidly spreading diseases over the past few decades shows the nee“I’d definitely call this an emerging science, and I think there’s still a lot to do,” Biggerstaff says. “Galvanizing this community through the [Predict the Flu Challenge] helped CDC lead this and put a stake in the ground that the agency thought this was an important area to invest in and be a part of.”
\ No newline at end of file +">“I’d definitely call this an emerging science, and I think there’s still a lot to do,” Biggerstaff says. “Galvanizing this community through the [Predict the Flu Challenge] helped CDC lead this and put a stake in the ground that the agency thought this was an important area to invest in and be a part of.” From 3a298b65fa2a99738b00f1ffcbc50465dd6b4d29 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Fri, 12 Jul 2024 15:19:11 -0500 Subject: [PATCH 11/47] Update gear-center-challenge.md Updated links trhoughout --- .../case-studies/gear-center-challenge.md | 60 ++++++------------- 1 file changed, 18 insertions(+), 42 deletions(-) diff --git a/pages/toolkit/case-studies/gear-center-challenge.md b/pages/toolkit/case-studies/gear-center-challenge.md index e4c007997..49969e47f 100644 --- a/pages/toolkit/case-studies/gear-center-challenge.md +++ b/pages/toolkit/case-studies/gear-center-challenge.md @@ -24,15 +24,13 @@ title: Case Study - GEAR Center ChallengeScale new solutions to boldly tackle government's most complex management challenges.
Over the summer of 2019, the General Services Administration (GSA) and the Office of Management and Budget (OMB) ran a GEAR Center prize competition on Challenge.gov. The competition challenged problem solvers from the public, academia, and industry to build cross-sector, multidisciplinary teams to demonstrate the potential of the GEAR Center .Teams described - how they would tackle challenges facing the government as outlined in the President's Management Agenda (PMA).
+Over the summer of 2019, the General Services Administration (GSA) and the Office of Management and Budget (OMB) ran a GEAR Center prize competition on Challenge.gov. The competition challenged problem solvers from the public, academia, and industry to build cross-sector, multidisciplinary teams to demonstrate the potential of the GEAR Center. Teams described how they would tackle challenges facing the government as outlined in the
GSA and OMB ran a successful prize competition on Challenge.gov. Leaders from across federal agencies acted as judges, with GSA also tapping subject-matter experts (SMEs) from Cross-Agency Priority (CAP) Goal teams - and other government-wide initiatives.
+GSA and OMB ran a successful prize competition on Challenge.gov. Leaders from across federal agencies acted as judges, with GSA also tapping subject-matter experts (SMEs) from Cross-Agency Priority (CAP) Goal teams and other government-wide initiatives.
Forty-nine eligible GEAR Center project proposals came in from solver teams representing:
After an intense three-phase evaluation process, three grand-prize winners and five honorable mentions were selected. The grand-prize winners each received $300,000. There were no cash awards for the honorable mentions.
Cybersecurity Workforce Collaboration - Under this solution, a federal neurodiversity cyber workforce will be established to focus on training a particular federal agency to identify, hire, onboard, - train, support, and retain neurodiverse individuals for cyber positions. This pilot program will be facilitated by winning team members including George Mason University, Mercyhurst University, Rochester Institute - of Technology, University of Maryland, Drexel University, SAP, Specialisterne, DXC Dandelion Program, and the MITRE Corporation.
-Data for Impact - Currently, data on federally funded workforce, education, and human services programs are too often held in silos that prevent local, state, and federal agencies from assessing the - true impact of their joint service delivery. This solution, a collaboration between SkillSource Group and Third Sector Capital Partners, Inc., strives to improve government use of administrative data to measure - impact. This team will pilot an approach to integrate currently disparate data that builds on existing state data integration efforts. The team will use many administrative data sources to measure the impact of - Workforce Innovation and Opportunity Act (WIOA) services for Virginia Opportunity Youth with past involvement with the child welfare and/or criminal justice systems.
-Data and Evidence for Government and Academic Impact - This project aims to help 250 federal practitioners in Kansas City by customizing an existing training curriculum and recommending how to replicate - and scale it in other regions. This collaboration focuses on improving the use of evidence and data by the public sector workforce among the Johns Hopkins University Centers for Civic Impact, and Volcker Alliance's - Government-to-University Initiative, and the Mid-America Regional Council.
+Cybersecurity Workforce Collaboration - Under this solution, a federal neurodiversity cyber workforce will be established to focus on training a particular federal agency to identify, hire, onboard, train, support, and retain neurodiverse individuals for cyber positions. This pilot program will be facilitated by winning team members including George Mason University, Mercyhurst University, Rochester Institute of Technology, University of Maryland, Drexel University, SAP, Specialisterne, DXC Dandelion Program, and the MITRE Corporation.
+Data for Impact - Currently, data on federally funded workforce, education, and human services programs are too often held in silos that prevent local, state, and federal agencies from assessing the true impact of their joint service delivery. This solution, a collaboration between SkillSource Group and Third Sector Capital Partners, Inc., strives to improve government use of administrative data to measure impact. This team will pilot an approach to integrate currently disparate data that builds on existing state data integration efforts. The team will use many administrative data sources to measure the impact of Workforce Innovation and Opportunity Act (WIOA) services for Virginia Opportunity Youth with past involvement with the child welfare and/or criminal justice systems.
+Data and Evidence for Government and Academic Impact - This project aims to help 250 federal practitioners in Kansas City by customizing an existing training curriculum and recommending how to replicate and scale it in other regions. This collaboration focuses on improving the use of evidence and data by the public sector workforce among the Johns Hopkins University Centers for Civic Impact, and Volcker Alliance's Government-to-University Initiative, and the Mid-America Regional Council.
Unlocking the Value of Government Data - Deloitte, Google, University of Maryland, and Datawheel collaborate to create pop-up data marketplaces.
Delivering the Workforce of the 21st Century - Launchcode's initiative to re-skill individuals for high needs jobs.
-Secure, Modern, and Mission-Capable Credentialing - This collaboration aims to improve the customer experience and efficiency of the credentialing process among its solvers, which include the Institute - for Defense Analyses, West Virginia Division of Homeland Security and Emergency Management, West Virginia National Guard, WVReady, University of Maryland-Center for Public Policy and Private Enterprise, and Marshall - University College of Information Technology and Engineering.
-Improving Grants Management Using Blockchain Technology - The MITRE Corporation team proposes demonstrating the benefits of a grants management operating model and blockchain-based Distributed Grants - Ledger using the joint efforts of private sector technology vendors, state government agencies, universities, and community-based service organizations.
+Secure, Modern, and Mission-Capable Credentialing - This collaboration aims to improve the customer experience and efficiency of the credentialing process among its solvers, which include the Institute for Defense Analyses, West Virginia Division of Homeland Security and Emergency Management, West Virginia National Guard, WVReady, University of Maryland-Center for Public Policy and Private Enterprise, and Marshall University College of Information Technology and Engineering.
+Improving Grants Management Using Blockchain Technology - The MITRE Corporation team proposes demonstrating the benefits of a grants management operating model and blockchain-based Distributed Grants Ledger using the joint efforts of private sector technology vendors, state government agencies, universities, and community-based service organizations.
The challenge was designed to run through three evaluation phases. This encouraged multiple types of solvers to participate given the low barrier to entry in their initial submission.
-As solver teams progressed through the evaluation process, they provided more detail on their project ideas. While multiple phases required more coordination and judging, they allowed for diverse judging panels who - offered multiple perspectives in their evaluations. They also enabled solver teams to address challenge requirements in more manageable segments.
+As solver teams progressed through the evaluation process, they provided more detail on their project ideas. While multiple phases required more coordination and judging, they allowed for diverse judging panels who offered multiple perspectives in their evaluations. They also enabled solver teams to address challenge requirements in more manageable segments.
Phase 1: Project Proposal
In the first phase, solvers submitted a two-page project proposal to scale or grow an existing initiative to deliver a relevant solution to a PMA-related challenge in one year. Solvers were asked to address the following:
The GEAR center challenge team hosted a webinar where PMA experts gave context on each of the PMA areas and helped answer questions. The team posted questions and answers from the webinar on the GEAR Center page on - Performance.gov.
-A panel of three judges used Phase 1 (P1) criteria (as stated in the challenge page) to evaluate 49 proposals and select 20 semifinalists to advance to the next phase.
+The GEAR center challenge team hosted a webinar where PMA experts gave context on each of the PMA areas and helped answer questions. The team posted questions and answers from the webinar on the GEAR Center page on Performance.gov.
+A panel of three judges used Phase 1 (P1) criteria to evaluate 49 proposals and select 20 semifinalists to advance to the next phase.
Phase 2: Project and GEAR Center Plan
In the second phase, the top 20 P1 solver teams were invited to submit a 10-page project plan and describe their ability to execute on it, as well as how this project would support a longer-term GEAR Center vision. They were asked to address the project plan and how easily they could do it, as well as GEAR Center model operation, impact, and sustainability. -The Gear Center challenge team hosted a second webinar to provide semifinalists more information on expectations for their submissions and to help answer questions. A panel of three judges (different from P1) used Phase - 2 (P2) criteria to evaluate 20 proposals and select 10 finalists to advance to the final round.
+The Gear Center challenge team hosted a second webinar to provide semifinalists more information on expectations for their submissions and to help answer questions. A panel of three judges (different from P1) used Phase 2 (P2) criteria to evaluate 20 proposals and select 10 finalists to advance to the final round.
Phase 3: Finalist Presentation
In the third phase, finalists presented their project proposals to a panel of federal executives, who were different from the judges from the first two phases.Teams presented their project proposals and engaged in an hour-long question-and-answer session with the judges. These sessions allowed judges to understand the innovation and the project feasibility, as well as whether the teams could deliver. The judges used Phase 3 criteria to evaluate presentations and select three grand-prize winning teams and five honorable-mention teams. The grand-prize winners each received $300,000, while no cash awards were given for honorable mention.Given the complex challenges targeted through PMA initiatives, we knew we needed input from SMEs working on those initiatives.
-The GEAR Center challenge team formed a network of CAP Goal team leaders and members, as well as other cross-government initiatives, to ensure the GEAR Center challenge would yield projects that would complement ongoing - efforts without duplicating them.
+The GEAR Center challenge team formed a network of CAP Goal team leaders and members, as well as other cross-government initiatives, to ensure the GEAR Center challenge would yield projects that would complement ongoing efforts without duplicating them.
This network was asked at several points to:
While engaging SMEs required extensive communications and coordination, their input was extremely valuable throughout the challenge. They were particularly helpful during Phase 2 and Phase 3 evaluations. Their deep - knowledge of specific initiatives helped determine proposed GEAR Center project feasibility and potential impact.
+While engaging SMEs required extensive communications and coordination, their input was extremely valuable throughout the challenge. They were particularly helpful during Phase 2 and Phase 3 evaluations. Their deep knowledge of specific initiatives helped determine proposed GEAR Center project feasibility and potential impact.
Providing ongoing communications and guidance was essential for success. We used a dedicated email account to communicate with solver teams throughout the challenge about timing and next-step expectations to ensure - that our process was transparent. We also used online meeting tools to engage with solver teams during the evaluation process. We used webinars to clarify intent for the first two phases, provide more context on - the PMA initiatives with SME input, and answer solver team questions. During Phase 3, finalists could deliver their presentations to judges in person, virtually using an online meeting tool, or by combining the - two.
+Providing ongoing communications and guidance was essential for success. We used a dedicated email account to communicate with solver teams throughout the challenge about timing and next-step expectations to ensure that our process was transparent. We also used online meeting tools to engage with solver teams during the evaluation process. We used webinars to clarify intent for the first two phases, provide more context on the PMA initiatives with SME input, and answer solver team questions. During Phase 3, finalists could deliver their presentations to judges in person, virtually using an online meeting tool, or by combining the two.
Throughout the challenge, we got the support of a highly capable and engaged cross-functional team. Our strategic communications partners helped us reach a wide range of quality solver teams from multiple sectors and to make clear announcements at key milestone events. The GEAR Center Challenge team’s general counsel provided timely legal advice throughout the challenge to ensure we conducted a transparent and sound process. Our budget office helped us to efficiently award payments to the three grand-prize winners. We engaged the budget office early in the process and had solver teams fill out necessary paperwork at the Phase 2 stage so that we could begin this financial clearance process early. The Challenge.gov team shared expert advice and best practices that helped us navigate every step of the challenge process.
While we stayed on schedule all the way up to the Phase 3 finalist presentations, we had some delays while clearing challenge results with key stakeholders. We told finalists of the delays during this period and worked - with our strategic communications partners and senior leaders to craft a clear announcement of challenge results. Once we completed the communications clearance process, we announced challenge results via multiple - channels.
+While we stayed on schedule all the way up to the Phase 3 finalist presentations, we had some delays while clearing challenge results with key stakeholders. We told finalists of the delays during this period and worked with our strategic communications partners and senior leaders to craft a clear announcement of challenge results. Once we completed the communications clearance process, we announced challenge results via multiple channels.
The GEAR Center was conceived as a way to promote innovation in support of the PMA. The GEAR Center Challenge project ideas showed how innovative cross-sector partnerships can transform government mission delivery, - service to citizens, and stewardship. By focusing the challenge on project idea proposals vs. solutions to specific problems, we were able to collect a diverse set of solver teams, given the broad scope and complexity - of PMA topics. This approach also helped us to better understand the types of projects and cross-sector partnerships that a GEAR Center would be best suited for.
+The GEAR Center was conceived as a way to promote innovation in support of the PMA. The GEAR Center Challenge project ideas showed how innovative cross-sector partnerships can transform government mission delivery, service to citizens, and stewardship. By focusing the challenge on project idea proposals vs. solutions to specific problems, we were able to collect a diverse set of solver teams, given the broad scope and complexity of PMA topics. This approach also helped us to better understand the types of projects and cross-sector partnerships that a GEAR Center would be best suited for.
This competition was conducted by GSA under the authority of the America COMPETES Reauthorization Act of 2010 (15 U.S. Code § 3719) as amended by the American Innovation and Competitiveness Act of 2017.
Over the summer of 2019, the General Services Administration (GSA) and the Office of Management and Budget (OMB) ran a GEAR Center prize competition on Challenge.gov. The competition challenged problem solvers from the public, academia, and industry to build cross-sector, multidisciplinary teams to demonstrate the potential of the GEAR Center. Teams described how they would tackle challenges facing the government as outlined in the
GSA and OMB ran a successful prize competition on Challenge.gov. Leaders from across federal agencies acted as judges, with GSA also tapping subject-matter experts (SMEs) from Cross-Agency Priority (CAP) Goal teams and other government-wide initiatives.
-Forty-nine eligible GEAR Center project proposals came in from solver teams representing:
+ Forty-nine eligible GEAR Center project proposals came in from solver teams representing:After an intense three-phase evaluation process, three grand-prize winners and five honorable mentions were selected. The grand-prize winners each received $300,000. There were no cash awards for the honorable mentions.
+ After an intense three-phase evaluation process, three grand-prize winners and five honorable mentions were selected. The grand-prize winners each received $300,000. There were no cash awards for the honorable mentions.Cybersecurity Workforce Collaboration - Under this solution, a federal neurodiversity cyber workforce will be established to focus on training a particular federal agency to identify, hire, onboard, train, support, and retain neurodiverse individuals for cyber positions. This pilot program will be facilitated by winning team members including George Mason University, Mercyhurst University, Rochester Institute of Technology, University of Maryland, Drexel University, SAP, Specialisterne, DXC Dandelion Program, and the MITRE Corporation.
Data for Impact - Currently, data on federally funded workforce, education, and human services programs are too often held in silos that prevent local, state, and federal agencies from assessing the true impact of their joint service delivery. This solution, a collaboration between SkillSource Group and Third Sector Capital Partners, Inc., strives to improve government use of administrative data to measure impact. This team will pilot an approach to integrate currently disparate data that builds on existing state data integration efforts. The team will use many administrative data sources to measure the impact of Workforce Innovation and Opportunity Act (WIOA) services for Virginia Opportunity Youth with past involvement with the child welfare and/or criminal justice systems.
@@ -76,16 +76,16 @@ title: Case Study - GEAR Center ChallengeThe GEAR center challenge team hosted a webinar where PMA experts gave context on each of the PMA areas and helped answer questions. The team posted questions and answers from the webinar on the GEAR Center page on Performance.gov.
-A panel of three judges used Phase 1 (P1) criteria to evaluate 49 proposals and select 20 semifinalists to advance to the next phase.
+ The GEAR center challenge team hosted a webinar where PMA experts gave context on each of the PMA areas and helped answer questions. The team posted questions and answers from the webinar on the GEAR Center page on Performance.gov. + A panel of three judges used Phase 1 (P1) criteria to evaluate 49 proposals and select 20 semifinalists to advance to the next phase.Phase 2: Project and GEAR Center Plan
In the second phase, the top 20 P1 solver teams were invited to submit a 10-page project plan and describe their ability to execute on it, as well as how this project would support a longer-term GEAR Center vision. They were asked to address the project plan and how easily they could do it, as well as GEAR Center model operation, impact, and sustainability. -The Gear Center challenge team hosted a second webinar to provide semifinalists more information on expectations for their submissions and to help answer questions. A panel of three judges (different from P1) used Phase 2 (P2) criteria to evaluate 20 proposals and select 10 finalists to advance to the final round.
+ The Gear Center challenge team hosted a second webinar to provide semifinalists more information on expectations for their submissions and to help answer questions. A panel of three judges (different from P1) used Phase 2 (P2) criteria to evaluate 20 proposals and select 10 finalists to advance to the final round.Phase 3: Finalist Presentation
In the third phase, finalists presented their project proposals to a panel of federal executives, who were different from the judges from the first two phases.Teams presented their project proposals and engaged in an hour-long question-and-answer session with the judges. These sessions allowed judges to understand the innovation and the project feasibility, as well as whether the teams could deliver. The judges used Phase 3 criteria to evaluate presentations and select three grand-prize winning teams and five honorable-mention teams. The grand-prize winners each received $300,000, while no cash awards were given for honorable mention.Given the complex challenges targeted through PMA initiatives, we knew we needed input from SMEs working on those initiatives.
-The GEAR Center challenge team formed a network of CAP Goal team leaders and members, as well as other cross-government initiatives, to ensure the GEAR Center challenge would yield projects that would complement ongoing efforts without duplicating them.
+ Given the complex challenges targeted through PMA initiatives, we knew we needed input from SMEs working on those initiatives. + The GEAR Center challenge team formed a network of CAP Goal team leaders and members, as well as other cross-government initiatives, to ensure the GEAR Center challenge would yield projects that would complement ongoing efforts without duplicating them.This network was asked at several points to:
While engaging SMEs required extensive communications and coordination, their input was extremely valuable throughout the challenge. They were particularly helpful during Phase 2 and Phase 3 evaluations. Their deep knowledge of specific initiatives helped determine proposed GEAR Center project feasibility and potential impact.
+ While engaging SMEs required extensive communications and coordination, their input was extremely valuable throughout the challenge. They were particularly helpful during Phase 2 and Phase 3 evaluations. Their deep knowledge of specific initiatives helped determine proposed GEAR Center project feasibility and potential impact.Providing ongoing communications and guidance was essential for success. We used a dedicated email account to communicate with solver teams throughout the challenge about timing and next-step expectations to ensure that our process was transparent. We also used online meeting tools to engage with solver teams during the evaluation process. We used webinars to clarify intent for the first two phases, provide more context on the PMA initiatives with SME input, and answer solver team questions. During Phase 3, finalists could deliver their presentations to judges in person, virtually using an online meeting tool, or by combining the two.
+ Providing ongoing communications and guidance was essential for success. We used a dedicated email account to communicate with solver teams throughout the challenge about timing and next-step expectations to ensure that our process was transparent. We also used online meeting tools to engage with solver teams during the evaluation process. We used webinars to clarify intent for the first two phases, provide more context on the PMA initiatives with SME input, and answer solver team questions. During Phase 3, finalists could deliver their presentations to judges in person, virtually using an online meeting tool, or by combining the two.Throughout the challenge, we got the support of a highly capable and engaged cross-functional team. Our strategic communications partners helped us reach a wide range of quality solver teams from multiple sectors and to make clear announcements at key milestone events. The GEAR Center Challenge team’s general counsel provided timely legal advice throughout the challenge to ensure we conducted a transparent and sound process. Our budget office helped us to efficiently award payments to the three grand-prize winners. We engaged the budget office early in the process and had solver teams fill out necessary paperwork at the Phase 2 stage so that we could begin this financial clearance process early. The Challenge.gov team shared expert advice and best practices that helped us navigate every step of the challenge process.
+ Throughout the challenge, we got the support of a highly capable and engaged cross-functional team. Our strategic communications partners helped us reach a wide range of quality solver teams from multiple sectors and to make clear announcements at key milestone events. The GEAR Center Challenge team’s general counsel provided timely legal advice throughout the challenge to ensure we conducted a transparent and sound process. Our budget office helped us to efficiently award payments to the three grand-prize winners. We engaged the budget office early in the process and had solver teams fill out necessary paperwork at the Phase 2 stage so that we could begin this financial clearance process early. The Challenge.gov team shared expert advice and best practices that helped us navigate every step of the challenge process.While we stayed on schedule all the way up to the Phase 3 finalist presentations, we had some delays while clearing challenge results with key stakeholders. We told finalists of the delays during this period and worked with our strategic communications partners and senior leaders to craft a clear announcement of challenge results. Once we completed the communications clearance process, we announced challenge results via multiple channels.
+ While we stayed on schedule all the way up to the Phase 3 finalist presentations, we had some delays while clearing challenge results with key stakeholders. We told finalists of the delays during this period and worked with our strategic communications partners and senior leaders to craft a clear announcement of challenge results. Once we completed the communications clearance process, we announced challenge results via multiple channels.The GEAR Center was conceived as a way to promote innovation in support of the PMA. The GEAR Center Challenge project ideas showed how innovative cross-sector partnerships can transform government mission delivery, service to citizens, and stewardship. By focusing the challenge on project idea proposals vs. solutions to specific problems, we were able to collect a diverse set of solver teams, given the broad scope and complexity of PMA topics. This approach also helped us to better understand the types of projects and cross-sector partnerships that a GEAR Center would be best suited for.
+ The GEAR Center was conceived as a way to promote innovation in support of the PMA. The GEAR Center Challenge project ideas showed how innovative cross-sector partnerships can transform government mission delivery, service to citizens, and stewardship. By focusing the challenge on project idea proposals vs. solutions to specific problems, we were able to collect a diverse set of solver teams, given the broad scope and complexity of PMA topics. This approach also helped us to better understand the types of projects and cross-sector partnerships that a GEAR Center would be best suited for.This competition was conducted by GSA under the authority of the America COMPETES Reauthorization Act of 2010 (15 U.S. Code § 3719) as amended by the American Innovation and Competitiveness Act of 2017.
+ This competition was conducted by GSA under the authority of the America COMPETES Reauthorization Act of 2010 (15 U.S. Code § 3719) as amended by the American Innovation and Competitiveness Act of 2017.GSA and OMB ran a successful prize competition on Challenge.gov. Leaders from across federal agencies acted as judges, with GSA also tapping subject-matter experts (SMEs) from Cross-Agency Priority (CAP) Goal teams and other government-wide initiatives.
+ GSA and OMB ran a successful prize competition on Challenge.gov. Leaders from across federal agencies acted as judges, with GSA also tapping subject-matter experts (SMEs) from Cross-Agency Priority (CAP) Goal teams and other government-wide initiatives. + Forty-nine eligible GEAR Center project proposals came in from solver teams representing:Cybersecurity Workforce Collaboration - Under this solution, a federal neurodiversity cyber workforce will be established to focus on training a particular federal agency to identify, hire, onboard, train, support, and retain neurodiverse individuals for cyber positions. This pilot program will be facilitated by winning team members including George Mason University, Mercyhurst University, Rochester Institute of Technology, University of Maryland, Drexel University, SAP, Specialisterne, DXC Dandelion Program, and the MITRE Corporation.
Data for Impact - Currently, data on federally funded workforce, education, and human services programs are too often held in silos that prevent local, state, and federal agencies from assessing the true impact of their joint service delivery. This solution, a collaboration between SkillSource Group and Third Sector Capital Partners, Inc., strives to improve government use of administrative data to measure impact. This team will pilot an approach to integrate currently disparate data that builds on existing state data integration efforts. The team will use many administrative data sources to measure the impact of Workforce Innovation and Opportunity Act (WIOA) services for Virginia Opportunity Youth with past involvement with the child welfare and/or criminal justice systems.
Data and Evidence for Government and Academic Impact - This project aims to help 250 federal practitioners in Kansas City by customizing an existing training curriculum and recommending how to replicate and scale it in other regions. This collaboration focuses on improving the use of evidence and data by the public sector workforce among the Johns Hopkins University Centers for Civic Impact, and Volcker Alliance's Government-to-University Initiative, and the Mid-America Regional Council.
+Unlocking the Value of Government Data - Deloitte, Google, University of Maryland, and Datawheel collaborate to create pop-up data marketplaces.
Delivering the Workforce of the 21st Century - Launchcode's initiative to re-skill individuals for high needs jobs.
@@ -83,7 +89,8 @@ title: Case Study - GEAR Center Challenge The Gear Center challenge team hosted a second webinar to provide semifinalists more information on expectations for their submissions and to help answer questions. A panel of three judges (different from P1) used Phase 2 (P2) criteria to evaluate 20 proposals and select 10 finalists to advance to the final round.Phase 3: Finalist Presentation
In the third phase, finalists presented their project proposals to a panel of federal executives, who were different from the judges from the first two phases.Teams presented their project proposals and engaged in an hour-long question-and-answer session with the judges. These sessions allowed judges to understand the innovation and the project feasibility, as well as whether the teams could deliver. The judges used Phase 3 criteria to evaluate presentations and select three grand-prize winning teams and five honorable-mention teams. The grand-prize winners each received $300,000, while no cash awards were given for honorable mention. -This network was asked at several points to:
@@ -95,18 +102,24 @@ title: Case Study - GEAR Center ChallengeAfter the challenge, NIST and a representative from the winning team testified about its purpose and outcomes before the House Subcommittee on Research and Technology. The hearing focused on Head Health Challenge III, which provided a catalyst for broader discussion of how prize competitions can be used to successfully address national priorities in science and technology.
America COMPETES Act
-https://ninesights.ninesigma.com/web/head-health -
+ @@ -97,4 +95,4 @@ title: Case Study - Head Health Challenge III - \ No newline at end of file + From b4ae2956a9a429d06f44286aef2562e36d6e24d0 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Fri, 12 Jul 2024 15:44:27 -0500 Subject: [PATCH 16/47] Update bridging-the-word-gap-challenge.md Fixed invalid link --- pages/toolkit/case-studies/bridging-the-word-gap-challenge.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/pages/toolkit/case-studies/bridging-the-word-gap-challenge.md b/pages/toolkit/case-studies/bridging-the-word-gap-challenge.md index 8d6171583..ca23dec9d 100644 --- a/pages/toolkit/case-studies/bridging-the-word-gap-challenge.md +++ b/pages/toolkit/case-studies/bridging-the-word-gap-challenge.md @@ -53,7 +53,7 @@ title: Case Study - Bridging the Word Gap ChallengeWith the guidance of the IDEA Lab and our Challenge Advisors, we built in multiple non-monetary incentives throughout the challenge to ensure that participants would continue throughout the three phases, and that the interventions would be continuously improved to result in the best iteration of the intervention by Phase 3.
We invited 10 external experts in the field to serve as voluntary Challenge Advisors. They represented content experts in academia, federal early childhood programs (the target audience), and technology experts. Initially, they offered insight into designing the challenge structure and the evaluation criteria for each phase, and in Phase 2 they served as one-on-one mentors for each Phase 1 winning team. The support, guidance, and insight into problem solving that they provided the teams from their own specific areas of expertise were invaluable.
In Phase 2, we supported travel for the nine teams to come to D.C. for Demo Day. At Demo Day, the nine teams were able to meet and learn from panelists including federal government staff who work in the innovation field, and also those from incubators and accelerators in the private sector, who gave information and advice on how to continue development and ensure broad reach of the interventions. While we funded the travel for the teams, the access to these thought leaders was a huge non-monetary incentive to continued participation in the challenge, as was connecting with each other and their advisors/mentors face-to-face.
-After announcing the final winner, we wanted to continue to catalyze the development of the winning intervention as well as the four other evidence-driven products that were advanced to Phase 3. We linked the winners from each phase to our Bridging the Word Gap Research Network, where they have access to partnership with researchers in this field, to continue to test the efficacy and build the evidence for the long-term impact of their interventions. We are currently continuing to support the cohort of semi-finalists, and are providing them with additional opportunities for connection, collaboration with federal programs and partners, and opportunities for further promotion, such as a recent webinar hosted for MCHB Home Visiting grantees in all states that highlighted the five teams with information on where to access their interventions.
+After announcing the final winner, we wanted to continue to catalyze the development of the winning intervention as well as the four other evidence-driven products that were advanced to Phase 3. We linked the winners from each phase to our Bridging the Word Gap Research Network, where they have access to partnership with researchers in this field, to continue to test the efficacy and build the evidence for the long-term impact of their interventions. We are currently continuing to support the cohort of semi-finalists, and are providing them with additional opportunities for connection, collaboration with federal programs and partners, and opportunities for further promotion, such as a recent webinar hosted for MCHB Home Visiting grantees in all states that highlighted the five teams with information on where to access their interventions.
Additionally, as part of a highly-visible federal initiative, the participants have been able to leverage other partnerships and opportunities. For example, due to participation in this challenge, the five semi-finalists have secured:
NIJ used the general NIJ statute that authorizes research, and the authority of 28 USC section 530C; the America COMPETES Act was not the authority for challenges at NIJ at the time of the Ultra High-Speed Apps Challenge.
- http://nij.gov/funding/pages/fy13-ultra-high-speed-apps-challenge.aspx -
s + NIJ Ultra-High Speed Apps Challenge: Using Current Technology to Improve Criminal Justice Operations + - \ No newline at end of file + From d75fa398c4be25335df860fdc6fdb79065886a76 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Fri, 12 Jul 2024 15:49:34 -0500 Subject: [PATCH 18/47] Update strain-measurement.md Removed invalid link --- pages/toolkit/case-studies/strain-measurement.md | 7 ++----- 1 file changed, 2 insertions(+), 5 deletions(-) diff --git a/pages/toolkit/case-studies/strain-measurement.md b/pages/toolkit/case-studies/strain-measurement.md index c0404f67a..90f8de319 100644 --- a/pages/toolkit/case-studies/strain-measurement.md +++ b/pages/toolkit/case-studies/strain-measurement.md @@ -46,10 +46,7 @@ title: Case Study - Strain MeasurementUnder the InnoCentive construct, the Strain Measurement Challenge was a theoretical design challenge, which required a more detailed engineering design as part of the solution submission. This effort was perfect for this challenge type because the team had been working on the problem for over three years when they were given the opportunity to launch it as a public competition through the InnoCentive contract. The team was, like so many technical teams, resource-constrained and had not had the opportunity to assign the level of resources required to find an internal solution to the problem. One of the lessons learned from this challenge, however, was that CoECI needed to better convey to first-time challenge owners the level of time commitment required to launch an external crowdsourced competition. It does require a resource investment, particularly in the evaluation phase. In the end, the team members were extremely pleased with the solutions they received and stated that the solutions provided were so simple and so elegant they were surprised they had not thought of them already.
Procurement Authority
-- https://www.innocentive.com/ar/challenge/9933145 -
+ @@ -57,4 +54,4 @@ title: Case Study - Strain Measurement - \ No newline at end of file + From 5b9dc84026a3b44e11fe39a4181374223893fc6f Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Fri, 12 Jul 2024 15:52:54 -0500 Subject: [PATCH 19/47] Update future-engineers-3d-space-design.md Removed invalid image --- .../case-studies/future-engineers-3d-space-design.md | 11 ++--------- 1 file changed, 2 insertions(+), 9 deletions(-) diff --git a/pages/toolkit/case-studies/future-engineers-3d-space-design.md b/pages/toolkit/case-studies/future-engineers-3d-space-design.md index c5d90c57a..28fa2c39d 100644 --- a/pages/toolkit/case-studies/future-engineers-3d-space-design.md +++ b/pages/toolkit/case-studies/future-engineers-3d-space-design.md @@ -25,14 +25,7 @@ title: Case Study - Future Engineers 3D Space DesignThink Out of the Box Design Challenge: Students were challenged to design a useful object for astronauts on future space exploration that can be expanded or assembled to be larger than the available volume of a 3D printer.
Launch Date: April 16, 2016 End Date: Aug. 1, 2016
The Space Tool Design Challenge received 470 submissions from two age groups: 5 to 12 years old (junior engineers) and 13 to 19 years old (teen engineers). In the junior group, the 10 semifinalists received a $50 3D printing gift certificate to allow entrants to print a design if they had no access to a 3D printer. The grand-prize winner of the junior group received a 3D printer for his or her school. The winning tool from the junior group was the Space Planter, which allows astronauts to grow plants with limited resources. The four runner-up contestants in the teen category received a trip to Los Angeles to visit SpaceX and Digital Domain, an Oscar-winning visual effects studio. The winner of the teen group designed the Multi-Purpose Precision Maintenance tool, which gives astronauts on the space station a single tool that combines a wrench, socket, ruler, wire gauges and wire stripper, so they do not have to carry a number of different tools. The winner of this challenge will have his design printed on the International Space Station and will discuss his design with astronauts aboard the station.
- - +The Space Container Design Challenge received 400 submissions from 36 states across two age groups: 5 to 12 years old (junior engineers) and 13 to 19 years old (teen engineers). In the junior group, the 10 semifinalists received a $50 3D printing gift certificate to allow entrants to print a design if they had no access to a 3D printer. The four runner-up contestants received a one-week scholarship to Space Camp in Huntsville, Alabama. The grand-prize winner received a 3D printer for his or her school and a private tour of the Space Shuttle Endeavor with an astronaut. The winning container from the junior group was the Flower Tea Cage, which uses the surface tension of liquids in a microgravity environment to allow astronauts to make tea. The winner of the teen group designed the ClipCatch, which will allow astronauts on the space station to clip their fingernails without worrying about the clippings floating away and potentially becoming harmful debris.
As of the writing of this case study, the Star Trek Replicator Challenge and the Think Out of the Box Design Challenge are still underway.
Description
+The Wave Energy Prize is an 18-month design-build-test prize competition that aims to:
With more than 50 percent of the U.S. population living within 50 miles of coastlines, there is vast potential to provide clean, renewable electricity to communities and cities in the United States using wave energy. It is estimated that the technically recoverable wave energy resource is approximately 900-1,200 terawatt hours (TWh) per year. Developing just a small fraction of the available wave energy resource could allow for millions of American homes to be powered with this clean, reliable form of energy. For context, approximately 90,000 homes could be powered by 1 TWh per year. Extracting just 5 percent of the technical resource potential could result in wave energy powering 5 million American homes.
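A quick arithmetic check can make the figures above concrete. The sketch below is illustrative only and is not part of the original case study; it simply multiplies the quoted resource estimate, the 90,000-homes-per-TWh conversion, and the 5 percent capture fraction:

```python
# Rough sanity check of the figures quoted above (illustrative only, not from the case study).
technical_resource_twh = (900, 1200)  # technically recoverable wave resource, TWh per year
homes_per_twh = 90_000                # "approximately 90,000 homes could be powered by 1 TWh per year"
capture_fraction = 0.05               # "just 5 percent of the technical resource potential"

for twh in technical_resource_twh:
    homes = twh * capture_fraction * homes_per_twh
    print(f"{twh} TWh/yr at 5% capture -> about {homes / 1e6:.1f} million homes")
# Prints values in the 4 to 5.5 million range, consistent with the "5 million American homes" figure.
```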
-Engagement
+The Wave Energy Prize has successfully mobilized both new and existing talent through the challenge, with engineers, developers and builders from across the country having thrown their hats in the ring. Participants include people who represent universities, small companies, more established players in wave energy and independent collaborations. Also, through the Marketplace, the competition has provided an online forum for external interested parties to collaborate with participating teams.
-Detailed prize structure
+The Wave Energy Prize is divided into three phases (design, build, and test) separated by four technology gates as shown and described below:
Design: For the first part of the design phase, participants were required to submit detailed
technical submissions describing their WEC concepts. The judging panel evaluated these submissions according to the Technology Performance Level rubric developed by the National Renewable Energy Laboratory and selected up to 20 qualified teams. These teams were then tasked with building 1/50th scale prototypes of their WEC concepts, numerically modeling their performance and developing detailed build plans for 1/20th scale prototypes. Qualified teams were required to test their 1/50th scale prototypes in 31 different sea states at one of five small-scale testing facilities (University of Iowa, University of Maine, University of Michigan, Stevens Institute of Technology and Oregon State University) across the country. The judging panel then evaluated device performance, numerical modeling results and build plans to select nine finalist and two alternate teams. The judging panel then selected up to ten finalists and two alternates from the qualified teams.
Build: Finalist and alternate teams were tasked with building 1/20th scale prototypes of their WEC concepts. Finalists were given up to $125,000 and alternates $25,000 to build these prototypes, with alternates becoming eligible for up to $125,000 were they to become finalists. Each team was paired with a data analyst from either the National Renewable Energy Laboratory or Sandia National Laboratories. Finalists and alternates worked with engineers at the Naval Surface Warfare Center Carderock Division and their assigned data analysts to come up with comprehensive test and evaluation plans to ensure a successful testing campaign at the Carderock Maneuvering and Seakeeping (MASK) Basin. The judging panel then selected nine finalist teams to proceed to testing at the MASK Basin starting August 2016.
Test: Each finalist team has been given one week on site at Carderock to prepare for testing and then one week of testing time in the MASK Basin. The test is to determine whether their 1/20th scale devices achieve double the state-of-the-art performance of WECs and are thus eligible to win the grand prize. If a team becomes eligible to win the grand prize, their WEC device's performance will be further evaluated to account for other important energy capture, reliability and survivability metrics using proxy measurements collected during MASK Basin testing. The testing program will provide the Department of Energy (DOE) and investors with apples-to-apples comparisons of WEC device techno-economic performance when operating in real ocean conditions.
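For readers who want the funnel at a glance, here is a minimal sketch that restates the design-build-test structure and the incentives described above as a data structure. The structuring and field names are ours, not DOE's; the figures are those quoted in this case study:

```python
# Illustrative outline (our own structuring) of the Wave Energy Prize design-build-test funnel.
WAVE_ENERGY_PRIZE_FUNNEL = {
    "design": {
        "deliverable": "technical submission, then 1/50th-scale prototype tested in 31 sea states",
        "teams": "92 registered -> up to 20 qualified",
    },
    "build": {
        "deliverable": "1/20th-scale prototype plus test and evaluation plan",
        "teams": "qualified teams -> 9 finalists and 2 alternates",
        "seed_funding_usd": {"finalist": 125_000, "alternate": 25_000},
    },
    "test": {
        "deliverable": "one week of testing in the MASK Basin at Carderock",
        "grand_prize_usd": 1_500_000,
        "eligibility": "double the state-of-the-art ACE metric (described below)",
    },
}

for phase, details in WAVE_ENERGY_PRIZE_FUNNEL.items():
    print(phase, "->", details)
```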
-Administrators and partners
+DOE's Water Power Program, along with a contracted prize administration team composed of Ricardo, Inc., JZ Consulting, and Polaris Strategic Communications; technical experts from Sandia National Laboratories and the National Renewable Energy Laboratory; and staff at the Naval Surface Warfare Center Carderock Division, is responsible for implementing the prize design, build, and test phases.
DOE also has partnered with various branches of the Department of the Navy to successfully execute the challenge. The Office of Naval Research has provided funds to develop the technologies and capabilities required to ensure fair and rigorous testing in the MASK Basin; the Naval Surface Warfare Center has provided in-kind support for the Judging Panel and reduced facility costs; and the Assistant Secretary of the Navy for Energy, Installations, and Environment has provided support to test three of the finalists in the MASK Basin.
The prize team also created a two-person independent expert review panel of prize and challenge experts—one from the White House Office of Science and Technology Policy and one formerly of the Defense Advanced Research Projects Agency—that, during go/no-go meetings for the challenge, provided guidance to the prize team on all aspects of the project, including testing program logistics, communications and outreach, and event planning.
-Incentives
+The Wave Energy Prize has provided a thoughtful package of incentives to attract developers to compete and give them the opportunity to reach their full potential and meet the goal of doubling state-of-the-art performance:
Based on these requirements, the technical experts at the National Renewable Energy Laboratory and Sandia National Laboratories worked with DOE and the prize administration team to develop a new metric to measure the state-of-the-art performance of WECs: ACE (Average Climate Capture Width per Characteristic Capital Expenditure). ACE represents the energy captured per unit structural cost of WECs. This is a proxy metric for LCOE. Just like LCOE is a cost-to-benefit metric ($/kWh), ACE is a benefit-to-cost metric that focuses on a key component that drives LCOE for WECs, namely structural cost. The denominator of ACE is a measure of the structural cost of the device, evaluated based on technical drawings, materials used and analytical load estimation on the device structure.
The state-of-the-art value for ACE is 1.5 meters per million dollars (1.5m/$M). A finalist becomes eligible to win the $1.5 million grand prize if they double ACE to 3m/$M during the final round of testing at the MASK Basin in Carderock.
The prize team still believed that ACE, while a significant step in the right direction for evaluating the performance of WECs, did not provide full confidence in WECs that could meet the true challenge of performing at reasonable cost in the open ocean. Thus, the prize team decided to evaluate teams eligible to win the grand prize according to a metric called Hydrodynamic Performance Quality (HPQ), which accounts for other important energy capture, reliability and survivability metrics using proxy measurements collected during MASK Basin testing. The teams that surpass the 3m/$M threshold will be ranked by their HPQ, and the team with the highest HPQ will win the $1.5 million grand prize.
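A small sketch of the eligibility and ranking logic described above may help. This is our illustration, not DOE's scoring code: ACE is computed from the two quantities named in the text, the 3m/$M threshold gates grand-prize eligibility, and HPQ (whose detailed calculation is not specified in this case study) is treated as a given input used only for ranking. The team values are hypothetical.

```python
# Illustrative sketch of the ACE threshold and HPQ ranking described above (not DOE's code).
# ACE = average climate capture width (m) / characteristic capital expenditure ($M),
# a benefit-to-cost proxy for LCOE; 1.5 m/$M is the quoted state of the art.
STATE_OF_THE_ART_ACE = 1.5                         # meters per million dollars
GRAND_PRIZE_THRESHOLD = 2 * STATE_OF_THE_ART_ACE   # 3.0 m/$M

def ace(capture_width_m: float, structural_capex_musd: float) -> float:
    return capture_width_m / structural_capex_musd

def grand_prize_winner(teams: dict) -> str | None:
    """Among teams that double the state of the art, return the one with the highest HPQ."""
    eligible = {name: t for name, t in teams.items()
                if ace(t["capture_width_m"], t["capex_musd"]) >= GRAND_PRIZE_THRESHOLD}
    if not eligible:
        return None
    return max(eligible, key=lambda name: eligible[name]["hpq"])

# Hypothetical teams, for illustration only.
teams = {
    "team_a": {"capture_width_m": 6.2, "capex_musd": 2.0, "hpq": 41.0},  # ACE = 3.1, eligible
    "team_b": {"capture_width_m": 5.0, "capex_musd": 2.0, "hpq": 55.0},  # ACE = 2.5, not eligible
}
print(grand_prize_winner(teams))  # -> "team_a"
```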
- +Area of Excellence #2: "Obtain Agency Clearance"
The DOE prize team in the Water Power Program engaged with senior leadership from the very inception of the idea to run a prize competition. Office leadership was briefed during the development of the challenge, including on topics such as goals, rules, testing program, judging process and communications and outreach plan. General Counsel (GC) informed the development of the rules, as well as the terms and conditions.
The DOE prize team also worked with GC from the very beginning of drafting the funding opportunity announcement for a prize administration team. GC helped the team prepare the Federal Register Notice in time for the announcement of the launch of the prize competition.
Given that the prize administration team was selected under a financial assistance agreement, the DOE contracting officer has been engaged in each important phase, especially in the go/no-go decisions where the team performed a rigorous evaluation of the prize continuation application after the registration period closed and after the end of the 1/50th scale testing program. The DOE National Environmental Policy Act (NEPA) staff, also part of the prize team, have ensured that all actions taken by participants during the competition meet the requirements of NEPA.
- +Area of Excellence #3: "Execute the Communications Plan"
The Wave Energy Prize has used several approaches to successfully publicize the prize, mobilize potential participants and create a strong following for the competition, including the following:
With more than 50 percent of the U.S. population living within 50 miles of coastlines, there is vast potential to provide clean, renewable electricity to communities and cities in the United States using wave energy. It is estimated that the technically recoverable wave energy resource is approximately 900-1,200 terawatt hours (TWh) per year. Developing just a small fraction of the available wave energy resource could allow for millions of American homes to be powered with this clean, reliable form of energy. For context, approximately 90,000 homes could be powered by 1 TWh per year. Extracting just 5 percent of the technical resource potential could result in wave energy powering 5 million American homes.
The Wave Energy Prize has successfully mobilized both new and existing talent through the challenge, with engineers, developers and builders from across the country having thrown their hats in the ring. Participants include people who represent universities, small companies, more established players in wave energy and independent collaborations. Also, through the Marketplace, the competition has provided an online forum for external interested parties to collaborate with participating teams.
+The Wave Energy Prize has successfully mobilized both new and existing talent through the challenge, with engineers, developers and builders from across the country having thrown their hats in the ring. Participants include people who represent universities, small companies, more established players in wave energy and independent collaborations. Also, the competition has provided an online forum for external interested parties to collaborate with participating teams.
The Wave Energy Prize is divided into three phases (design, build, and test) separated by four technology gates as shown and described below:
Design: For the first part of the design phase, participants were required to submit detailed
@@ -59,15 +59,14 @@ title: Case Study - Wave Energy Prize
The challenge is not yet complete, but there have been numerous successes so far in attracting new and existing players to wave energy; having teams successfully reach aggressive technical milestones; bringing forward innovations across a range of WEC device types; generating significant publicity; and building technical capacity to test WECs at testing facilities across the country.
With an aggressive communications and outreach strategy, 92 teams registered for the competition, three times more than expected. Of these, 66 turned in technical submissions, which were evaluated by a panel of expert judges to identify 20 qualified teams. Most teams that registered were not previously known to DOE. Seventeen of the 20 qualified teams completed the initial small-scale testing phase, and out of the nine finalists and two alternates, only two have received any funding from DOE in the past.
-Most of the teams have met the aggressive timelines for the challenge. For example, to meet the requirements for Technology Gate 2, the qualified teams built 1/50th scale model devices, tested them at university facilities around the country and conducted significant numerical modeling studies in just four months. As of June 2016, and as can be seen on the Team - Updates webpage, finalists and alternates have made significant progress in designing, building, and testing their 1/20th scale devices. This puts the challenge in a great position to achieve its remaining objectives.
+Most of the teams have met the aggressive timelines for the challenge. For example, to meet the requirements for Technology Gate 2, the qualified teams built 1/50th scale model devices, tested them at university facilities around the country and conducted significant numerical modeling studies in just four months. As of June 2016, finalists and alternates have made significant progress in designing, building, and testing their 1/20th scale devices. This puts the challenge in a great position to achieve its remaining objectives.
The finalists and alternates have put forward diverse WEC designs, which include two submerged areal absorbers, four point absorbers, two attenuators and three terminators. And in these designs, DOE is already seeing technical innovations in the areas of geometry, materials, power conversion and controls. Some of these include:
Further, teams have been required per the rules to communicate - publicly on the website about their progress and to speak about the challenge and their participation in their own words. Participants put a lot of blood, sweat and tears into a competition, and it is important to shed light on their stories and why they are participating.
+Further, teams have been required per the rules to communicate + publicly on the website about their progress and to speak about the challenge and their participation in their own words. Participants put a lot of blood, sweat and tears into a competition, and it is important to shed light on their stories and why they are participating.
Below are details on how the prize team has worked to ensure a successfully executed communications plan.
Strong relationships with different communications teams in DOE: The Wave Energy Prize has leveraged the fact that different communications teams within DOE have different audiences and outlets, and the prize team has tailored content to the audiences reached by different communications teams. Both the DOE prize team and the prize administration team have worked to understand how different communications teams in DOE from the Office of Energy Efficiency and Renewable Energy and DOE Public Affairs work and what their protocols are.
Synergistic communications plans and cross-promotion: The Wave Energy Prize is working off a communications and outreach plan in which DOE prize team communications and prize administration team communications are synergistic, and there is a clear delineation of the kinds of communications that come from each of the two teams. All communications that the DOE prize team and the prize administration team put out are planned one month in advance to ensure momentum. Media coverage, monthly newsletters and blogs and team features come from the prize administration team; synthesis and reflection pieces are written by the DOE prize team; and press releases at key stages of the competition are published by both teams. The prize administration team promotes all communications relating to the prize coming from any DOE office.
@@ -122,7 +121,7 @@ title: Case Study - Wave Energy Prize
The goal of the Wave Energy Prize is to double the state-of-the-art performance of wave energy converters as measured through a metric called ACE, short for average climate capture width per characteristic capital expenditure—clearly a mouthful, and entirely jargon. ACE is a measure of the effectiveness of a WEC at absorbing power from the incident wave energy field divided by a measure of the capital expenditure in commercial production of the load-bearing device structure. So how can that be said simply?
Here is how this metric is communicated to the public: "ACE is determined by dividing, in essence, the wave energy extraction efficiency of a wave energy converter by its structural cost." This language is simple enough to make the key metric of the Wave Energy Prize understandable to many more people than just experts in wave energy. And the prize team has been directing those interested in the details to blog posts with more technical depth.
Tracking impact and creating an archive: The prize administration team has tracked all key statistics pointing to the impact of communications efforts, including visitor numbers; click-through rates; time spent on webpages; geographic location; browser type; social media impressions on Facebook, LinkedIn and Twitter; and so on. The prize administration team provides monthly reports to the DOE prize team on these statistics, and changes to communications plans are made accordingly.
-Also, since Wave Energy Prize is unfolding over 18 months, the DOE prize team and prize administration team archive all outreach and media content on the Newsroom page of the competition website, creating a narrative arc for those interested.
+Also, since the Wave Energy Prize is unfolding over 18 months, the DOE prize team and prize administration team archive all outreach and media content on the competition website, creating a narrative arc for those interested.
The DOE Wave Energy Prize won GSA Challenge.gov's Five Years of Excellence in Federal Challenge & Prize Competition Award for Best Challenge Engagement Strategy, and its communications plan was a key part of its success. The Wave Energy Prize has tracked analytics on its various social media platforms and its website, and developed a two-track communications plan, one executed by the contractor selected to administer the prize, and one executed by the program-level agency team.
Area of Excellence #4: "Accept Solutions"
Registration and eligibility: The Wave Energy Prize registration period started April 27, 2015, and the end date was extended from June 15, 2015, to June 30, 2015, to maximize the number of teams entering the prize funnel. During this extended window, registration jumped from 50 teams on June 15 to 92 on June 30.
@@ -143,7 +142,7 @@ title: Case Study - Wave Energy Prize
America COMPETES Reauthorization Act
This challenge was a success on several important levels. First, it resulted in the identification of a software solution that met an important and emerging public health need and established an important precedent for engagement, innovation and collaboration between public health and the biotech startup community. Second, the competition elicited significant interest around the public health challenges posed by new non-culture-based clinical diagnostic assays and has helped to foster academic research around potential solutions. And third, it has helped to establish challenge competitions and public-private partnerships as key drivers for innovation at the agency.
America COMPETES Act
-http://www.cdc.gov/amd/achievements/cidtchallenge +
@@ -59,4 +58,4 @@ title: Case Study - No Petri Dish - \ No newline at end of file + From 3ecc9f6ed1a6ca31e0f9ce7dec4ae01e1764df74 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Fri, 12 Jul 2024 16:11:05 -0500 Subject: [PATCH 23/47] Update cdc-no-petri-dish.md --- pages/toolkit/case-studies/cdc-no-petri-dish.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/pages/toolkit/case-studies/cdc-no-petri-dish.md b/pages/toolkit/case-studies/cdc-no-petri-dish.md index fe4f3958f..d346c0acd 100644 --- a/pages/toolkit/case-studies/cdc-no-petri-dish.md +++ b/pages/toolkit/case-studies/cdc-no-petri-dish.md @@ -50,7 +50,7 @@ title: Case Study - No Petri DishAmerica COMPETES Act
- + From 1aa3c1548546dae1e2a5156f09ba350ad61acb17 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Fri, 12 Jul 2024 16:13:21 -0500 Subject: [PATCH 24/47] Update usaid-desal-prize.md --- pages/toolkit/case-studies/usaid-desal-prize.md | 13 +++---------- 1 file changed, 3 insertions(+), 10 deletions(-) diff --git a/pages/toolkit/case-studies/usaid-desal-prize.md b/pages/toolkit/case-studies/usaid-desal-prize.md index c56d9d4fa..bb3a547ec 100644 --- a/pages/toolkit/case-studies/usaid-desal-prize.md +++ b/pages/toolkit/case-studies/usaid-desal-prize.md @@ -136,14 +136,7 @@ title: Case Study - DESAL PrizeUSAID Assistance Authority
http://www.securingwaterforfood.org +
Securing Water for Food: a Grand Challenge for Development
- - - - - - - - - \ No newline at end of file + + From 264751289658885113050be29ad35c2219b4ce1d Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Fri, 12 Jul 2024 16:15:46 -0500 Subject: [PATCH 25/47] Update usaid-desal-prize.md --- pages/toolkit/case-studies/usaid-desal-prize.md | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/pages/toolkit/case-studies/usaid-desal-prize.md b/pages/toolkit/case-studies/usaid-desal-prize.md index bb3a547ec..1f7d8f0a4 100644 --- a/pages/toolkit/case-studies/usaid-desal-prize.md +++ b/pages/toolkit/case-studies/usaid-desal-prize.md @@ -131,8 +131,7 @@ title: Case Study - DESAL PrizeUSAID Assistance Authority
USAID Assistance Authority
USAID Assistance Authority
Securing Water for Food: a Grand Challenge for Development -
+Securing Water for Food: a Grand Challenge for Development
From c2d51a435308e9a23cad7e8b42e0a0a777debd4e Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Fri, 12 Jul 2024 16:22:29 -0500 Subject: [PATCH 28/47] Update usaid-desal-prize.md --- pages/toolkit/case-studies/usaid-desal-prize.md | 9 ++++++++- 1 file changed, 8 insertions(+), 1 deletion(-) diff --git a/pages/toolkit/case-studies/usaid-desal-prize.md b/pages/toolkit/case-studies/usaid-desal-prize.md index 6988d1605..5c0c9da71 100644 --- a/pages/toolkit/case-studies/usaid-desal-prize.md +++ b/pages/toolkit/case-studies/usaid-desal-prize.md @@ -138,4 +138,11 @@ title: Case Study - DESAL PrizeSecuring Water for Food: a Grand Challange for Development
- + + + + + + + + From 4fae295a77292fe6b660f2d4b30643469f307e8a Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Fri, 12 Jul 2024 16:27:04 -0500 Subject: [PATCH 29/47] Update usaid-desal-prize.md --- pages/toolkit/case-studies/usaid-desal-prize.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/pages/toolkit/case-studies/usaid-desal-prize.md b/pages/toolkit/case-studies/usaid-desal-prize.md index 5c0c9da71..9c8e3692f 100644 --- a/pages/toolkit/case-studies/usaid-desal-prize.md +++ b/pages/toolkit/case-studies/usaid-desal-prize.md @@ -107,7 +107,7 @@ title: Case Study - DESAL PrizeIn addition, the Desal Prize was featured in blog posts and articles including the Water Desalination Report. The Desal Prize staff attended conferences and webinars to further enhance communications efforts.
Communications efforts were led by the USAID Global Development Lab Office of Communications, with support from a communications contractor.
Area of Excellence #6: "Accept Solutions"
-For the Phase-2 Technological Demonstration, semifinalist teams were required to submit the following via the platform found at http://thedesalprize.net:
+For the Phase-2 Technological Demonstration, semifinalist teams were required to submit the following:
Scientific
This challenge was a success on several important levels. First, it resulted in the identification of a software solution that met an important and emerging public health need and established an important precedent for engagement, innovation and collaboration between public health and the biotech startup community. Second, the competition elicited significant interest around the public health challenges posed by new non-culture-based clinical diagnostic assays and has helped to foster academic research around potential solutions. And third, it has helped to establish challenge competitions and public-private partnerships as key drivers for innovation at the agency.
America COMPETES Act
- - +America COMPETES Act
From 8fb76a3c09ed06acf1fe6a0f45a57d606a6c6653 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Thu, 26 Dec 2024 11:25:25 -0600 Subject: [PATCH 35/47] Update epa-nutrient-sensor.md --- pages/toolkit/case-studies/epa-nutrient-sensor.md | 5 +---- 1 file changed, 1 insertion(+), 4 deletions(-) diff --git a/pages/toolkit/case-studies/epa-nutrient-sensor.md b/pages/toolkit/case-studies/epa-nutrient-sensor.md index 0a076f73b..75e932f7e 100644 --- a/pages/toolkit/case-studies/epa-nutrient-sensor.md +++ b/pages/toolkit/case-studies/epa-nutrient-sensor.md @@ -55,9 +55,6 @@ title: Case Study - Nutrient Sensor ChallengeThe first event held by the challenge was the Nutrient Sensor Challenge Summit in August 2015. This was an opportunity for the 29 registered teams to convene to discuss, learn, network and demonstrate their abilities. The summit served as a great meeting point of the technology developers and the users. The market efforts were emphasized and networking was encouraged among the attendees. Following the summit, no-risk beta testing began. This phase of testing was an opportunity for the teams to take advantage of no-cost, no-risk laboratory and field testing as an important milestone towards final verification testing in 2016. Final verification testing began in May 2016 and will be held in aquatic ecosystems in three locations – Hawaii, Michigan and Maryland.
EPA Authority – Clean Water Act
-USAID Assistance Authority
-Securing Water for Food: a Grand Challange for Development
- +USAID Assistance Authority
From 1f50592416a2261ae03c94881923d27b7186a8ee Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Thu, 26 Dec 2024 11:37:32 -0600 Subject: [PATCH 37/47] Update iarpa-instinct.md --- pages/toolkit/case-studies/iarpa-instinct.md | 4 ---- 1 file changed, 4 deletions(-) diff --git a/pages/toolkit/case-studies/iarpa-instinct.md b/pages/toolkit/case-studies/iarpa-instinct.md index 1c6de634a..56bc7820e 100644 --- a/pages/toolkit/case-studies/iarpa-instinct.md +++ b/pages/toolkit/case-studies/iarpa-instinct.md @@ -54,10 +54,6 @@ title: Case Study - INSTINCT challengeHow do you know if you can trust someone? The INSTINCT Challenge asked members of the American public to develop algorithms that improved predictions of trustworthiness using neural, physiological and behavioral data recorded during experiments in which volunteers made high-stakes promises and chose whether or not to keep them. Answering this question accurately is essential for society in general—but particularly so in the Intelligence Community (IC), where knowing whom to trust is often vital.
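The paragraph above describes, in effect, a supervised prediction task: given neural, physiological and behavioral measurements, predict whether a volunteer kept a high-stakes promise. The sketch below shows one minimal way to frame such a task; it is our illustration only, with synthetic stand-in data, and does not reflect IARPA's actual data, features or scoring.

```python
# Illustrative framing of trustworthiness prediction as binary classification (not IARPA's code).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
# Hypothetical stand-ins for neural / physiological / behavioral features.
X = rng.normal(size=(n, 5))
# Hypothetical labels: 1 if the volunteer kept the promise, 0 otherwise.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

model = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```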
Procurement authority
-Up to 50 bonus points were awarded to any contestant that submitted forecasts for the 10 HHS Regions. Solutions submitted by the teams varied in format and complexity as no common standard existed for receiving and evaluating the accuracy of influenza forecasts, making comparison and interpretation by the judges difficult. CDC challenge management worked with the judges to provide a comprehensive and clear compilation of individual team forecasts. The challenges in data management and judging were noted, and standardized forecasting formats and accuracy assessments were developed for subsequent challenges.
Area of Excellence #3: "Document the Challenge"
Participating teams were invited to travel to Atlanta to present on their methodology and results and discuss lessons learned and the next steps, including participation in future forecasting challenges. Participating teams used the opportunity to share datasets and forecasting methodologies and provided valuable input that helped shape subsequent forecasting challenges.
-CDC also coordinated a scientific manuscript documenting the challenge, the results and the lessons learned to ensure that the information was captured and available to the public in the open-access, peer reviewed journalBioMed Central." Summarizing the challenge in a scientific manuscript was chosen because authorship provided an additional incentive to participating teams and allowed the results and conclusions of the challenge to be reviewed by experts in the field, increasing the credibility of the findings.
+CDC also coordinated a scientific manuscript documenting the challenge, the results and the lessons learned to ensure that the information was captured and available to the public in the open-access, peer reviewed journal BioMed Central." Summarizing the challenge in a scientific manuscript was chosen because authorship provided an additional incentive to participating teams and allowed the results and conclusions of the challenge to be reviewed by experts in the field, increasing the credibility of the findings.
Analytics
CDC hosted this challenge to spur innovation in the development of mathematical and statistical models to predict the timing, peak and intensity of the influenza season. This challenge required the development of forecasting models that used open-access data from existing CDC surveillance systems, including the U.S. Outpatient Influenza-like Illness Surveillance Network (ILINet) and Internet-derived data on influenza activity (e.g., Twitter data, Internet search term data, Internet-based surveys), which have been shown to have correlation with influenza activity.
Because of the various data sources utilized, the challenge encouraged a strong connection between forecasters, subject matter experts and public health decision makers. Forecasters needed support understanding the nuances of CDC's surveillance data while public health decision makers needed support understanding the different digital data sources and forecasting methodologies. This challenge identified a number of areas that need further research before forecasting can be routinely incorporated into decision-making, including the best metrics to assess forecast accuracy, the best way to communicate forecast uncertainty and the types of decisions best aided by forecasts. To help fill these research gaps, CDC has built upon the success of the original challenge to host additional challenges to predict subsequent influenza seasons.
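Because the original challenge had no common standard for receiving forecasts or assessing their accuracy, a sketch of what a standardized weekly forecast record and a simple accuracy assessment could look like is shown below. The field names and the use of mean absolute error are our assumptions for illustration, not CDC's actual submission format or scoring.

```python
# Illustrative standardized forecast record and a simple accuracy score (not CDC's format).
from dataclasses import dataclass

@dataclass
class WeeklyForecast:
    region: str           # e.g. "US National" or one of the 10 HHS Regions
    mmwr_week: int        # epidemiological week being forecast
    predicted_ili: float  # predicted % of outpatient visits for influenza-like illness

def mean_absolute_error(forecasts, observed):
    """Average absolute difference between predicted and observed ILI, per (region, week)."""
    errors = [abs(f.predicted_ili - observed[(f.region, f.mmwr_week)]) for f in forecasts]
    return sum(errors) / len(errors)

forecasts = [WeeklyForecast("US National", 2, 3.1), WeeklyForecast("US National", 3, 3.8)]
observed = {("US National", 2): 2.9, ("US National", 3): 4.2}
print(mean_absolute_error(forecasts, observed))  # approximately 0.3
```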
America COMPETES Act
-- http://www.cdc.gov/flu/news/predict-flu-challenge.htm -
-- http://www.cdc.gov/flu/news/predict-flu-challenge-winner.htm -
From 6998486c3c1590464bf73007f7d6218e2d475247 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Thu, 26 Dec 2024 11:47:02 -0600 Subject: [PATCH 39/47] Update cpsc-carbon-monoxide.md --- pages/toolkit/case-studies/cpsc-carbon-monoxide.md | 3 --- 1 file changed, 3 deletions(-) diff --git a/pages/toolkit/case-studies/cpsc-carbon-monoxide.md b/pages/toolkit/case-studies/cpsc-carbon-monoxide.md index 256016bfc..5da492cf2 100644 --- a/pages/toolkit/case-studies/cpsc-carbon-monoxide.md +++ b/pages/toolkit/case-studies/cpsc-carbon-monoxide.md @@ -50,9 +50,6 @@ title: Case Study - Carbon Monoxide Poster ContestCPSC's carbon monoxide poster contest challenged students to create a work of art that not only looked appealing, but also had a strong educational message about a dangerous killer right in their own homes, carbon monoxide. Students were also challenged to show how CO could be prevented with carbon monoxide alarms and other safety measures.
America COMPETES Act
-http://www.cpsc.gov/en/Safety-Education/CO-Contest-2014/ -
From 42ab503001e440fea82862df2c06d0308e956a8c Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Thu, 26 Dec 2024 11:51:10 -0600 Subject: [PATCH 40/47] Update nist-reference-data.md --- .../case-studies/nist-reference-data.md | 18 ------------------ 1 file changed, 18 deletions(-) diff --git a/pages/toolkit/case-studies/nist-reference-data.md b/pages/toolkit/case-studies/nist-reference-data.md index 0caf22284..0821a98f6 100644 --- a/pages/toolkit/case-studies/nist-reference-data.md +++ b/pages/toolkit/case-studies/nist-reference-data.md @@ -53,24 +53,6 @@ title: Case Study - Reference Data Challenge -NOTE: Feature is not available for challenges that redirect to an external site.
Who are you?
-My name is Tim K. Mackey, and I am the co-founder and CEO of [S-3 Research LLC](https://www.s-3.io/){:target="_blank"}. I’m essentially a researcher-turned-entrepreneur with the help of the U.S. government through the SUD Startup Challenge award and the [Small Business Innovation Research (SBIR) program](https://nida.nih.gov/funding/small-business-innovation-research-sbir-technology-transfer-sttr-programs){:target="_blank"}. I am also a current associate professor at UC San Diego where I teach and research on global health, health technology, and public policy. +My name is Tim K. Mackey, and I am the co-founder and CEO of [S-3 Research LLC](https://www.s-3.io/){:target="_blank"}. I’m essentially a researcher-turned-entrepreneur with the help of the U.S. government through the SUD Startup Challenge award and the Small Business Innovation Research (SBIR) program. I am also a current associate professor at UC San Diego where I teach and research on global health, health technology, and public policy. **What is the name of your company, where is it located, and what does it “do”?** @@ -54,7 +54,7 @@ The scourge of the opioid epidemic and its toll on society is real and acute. I **How did you hear about NIDA’s SUD Startup Challenge and why did you decide to apply?** -We heard about the challenge after participating in the HHS 2017 Opioid Code-a-Thon after being invited to form a team by a colleague formally at the U.S. Centers for Disease Control and Prevention. We were chosen as a finalist but didn’t win one of the three prizes, however, this gave us the opportunity to learn about the Challenge award and we applied.
We heard about the challenge after participating in the HHS 2017 Opioid Code-a-Thon, which we joined after being invited to form a team by a colleague formerly at the U.S. Centers for Disease Control and Prevention. We were chosen as a finalist but didn’t win one of the three prizes; however, this gave us the opportunity to learn about the Challenge award, and we applied.
From dba3693a6fe571a0059c1e3a537f158974496c25 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Thu, 26 Dec 2024 12:39:42 -0600 Subject: [PATCH 44/47] Update 2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md --- ...2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/_posts/2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md b/_posts/2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md index 4f3b06ce3..5470c46c7 100644 --- a/_posts/2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md +++ b/_posts/2020-07-28-winner-q-a-with-tim-mackey-of-s-3-research-llc.md @@ -13,7 +13,7 @@ image: /assets/netlify-uploads/1.png image_alt_text: winner Q&A header image post-body-content-uploads: /assets/netlify-uploads/webp.net-resizeimage-1-.jpg --- -Recently, the [National Institute on Drug Abuse (NIDA)](https://www.nih.gov/about-nih/what-we-do/nih-almanac/national-institute-drug-abuse-nida){:target="_blank"}, one of the components of the National Institutes of Health (NIH), announced the winners of its fifth “$100,000 for Start a SUD Startup” Challenge, which was hosted on GSA’s Challenge.gov platform. The SUD Startup Challenge goal is to support research ideas that would further an understanding of substance use disorders (SUD) and that are intended to lay the foundation for the development of successful new startups. +Recently, the [National Institute on Drug Abuse (NIDA)](https://www.nih.gov/about-nih/what-we-do/nih-almanac/national-institute-drug-abuse-nida), one of the components of the National Institutes of Health (NIH), announced the winners of its fifth “$100,000 for Start a SUD Startup” Challenge, which was hosted on GSA’s Challenge.gov platform. The SUD Startup Challenge goal is to support research ideas that would further an understanding of substance use disorders (SUD) and that are intended to lay the foundation for the development of successful new startups. In anticipation of the announcement, Challenge.gov recently caught up with a past winner to learn more about their motivation to participate in NIDA’s annual competition, what they learned from the experience, what impact their prize-winning solution is making today, and what advice they have for the latest crop of winners. From 296aa4d9bd88dda9281945f0f58f3ded7abe4803 Mon Sep 17 00:00:00 2001 From: Renata Bartlett <66442872+r-bartlett-gsa@users.noreply.github.com> Date: Thu, 26 Dec 2024 13:45:11 -0600 Subject: [PATCH 45/47] Update resources.md --- pages/resources.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/pages/resources.md b/pages/resources.md index e40f997af..457f4367f 100644 --- a/pages/resources.md +++ b/pages/resources.md @@ -1,7 +1,7 @@ --- permalink: /resources/ layout: page -title: Resources +title: Resources for Public Innovators ---DHS intends for this work to be the first step in the design of a local and/or national-level system that could enable city-level operators to make critical and proactive decisions based on the most relevant and actionable insights. To form the basis for a proof of concept, the challenge focused on large metropolitan areas such as New York, Los Angeles, Washington D.C., Chicago, Boston, and Atlanta.
The challenge launched Oct. 17, 2017. DHS S&T used an interagency agreement with NASA's Center of - Excellence for Collaborative Innovation to contract with a third-party prize administrator, Luminary Labs. A website was established that provided access to additional information, newsletters, blogs, and other information about the challenge.
+ Excellence for Collaborative Innovation to contract with a third-party prize administrator, Luminary Labs. Hidden Signals Challenge website was established that provided access to additional information, newsletters, blogs, and other information about the challenge.The challenge received 37 submissions from entities, teams, and individuals from academia; artificial intelligence / machine learning and information technology fields; city and federal government; health technology and research fields; and research think tanks.
Stage 1 finalists:
@@ -135,4 +135,4 @@ title: Case Study - Hidden Signals Challenge