challenge-title: Veterans' Employment Challenge
layout: front-matter-data
permalink: /challenge/vets-match/
challenge-id: 1085
status: closed
sidenav: false
card-image: /assets/images/cards/Vets-Employment-Challenge-logoV5-bold-340x160.jpg
agency-logo: DOL-MasterLogo_Color300.png
tagline: Build a better application to match the skills of those who have served in the military with employer needs.
agency: Department of Labor
partner-agencies-federal: Department of Defense, Department of Veterans Affairs
total-prize-offered-cash: $1,000,000
type-of-challenge: Software
submission-start: 12/11/2019 10:30 AM
submission-end: 01/24/2020 5:00 PM
fiscal-year: FY20
legal-authority: America COMPETES Act
challenge-manager: Matthew Grob

Description

The goal of this challenge is to develop a job-matching tool that will be piloted in a Transition Assistance Program (TAP) Employment Workshop at selected base locations. If the tool is effective, it could be added to the Employment Workshop curriculum for all transitioning service members.

In 2018, the Department of Labor (DOL), the Department of Defense (DOD), and the Department of Veterans Affairs (VA) began a cross-agency effort to understand the transition of a Service member from active duty to civilian employment from the perspective of the individual navigating this process, not the perspective of the Federal organizations that support a plethora of programs. The agencies built a journey map of this process and found a recurring theme: navigating the “sea of support” and the resources related to the job search can be a challenging task.

[Image: Department of Veterans Affairs, Department of Labor, and Department of Defense seals]

Veterans complete their military service with unique and technical skillsets that bring value to all sectors of the economy, but they can encounter challenges framing their experience and skills for civilian employers. American businesses report that they are missing an opportunity to attract and retain a capable, competent workforce, and they recognize the value that veterans and spouses bring.

Many job boards, veteran hiring programs, and other initiatives exist, but the valuable data is spread across disparate platforms. Navigating these resources can be overwhelming both for first-time job-seeking veterans and for businesses with limited time to fill urgent needs. Additionally, most job search tools for Service members are based purely on military occupational specialties, which do not accurately represent the breadth of experience many individuals have. Finally, small businesses in particular do not have the staff capacity to build out robust skills profiles that better describe their hiring needs.

There is a need for both a more sophisticated matching mechanism and a simpler interface that can pull from existing data sources (from Federal platforms like O*NET, the Credentialing Opportunities On-Line (COOL) program, and the National Labor Exchange, to LinkedIn profiles, resumes, or job descriptions that individuals and businesses have created).

The Government does not want to prescribe what the platform should look like or how it will work, but it does want to note some of what job-seekers and Federal employees shared during the research that informed the design of this competition. In particular, the agencies heard a need for:

  • A matching-oriented application for the job-searching process. A platform that could determine likelihood of fit, reveal matches, and allow both employers and job-seekers to opt in, share more information with each other, and potentially interact further (connecting with Veterans at the company, chatting, etc.). A minimal matching sketch follows this list.
  • A platform that has functions like a “ZocDoc” for job searching. Employers can self-input information (like doctors input their location, office hours, and information about their practice), external reviews from users are included, and users can actually take an action (book an appointment) in one place.
  • A one-stop-shop verification system of employers. Employers on the platform could be matched automatically with databases that note if they are Federal contractors, have earned a HIRE Vets Medallion, or display information to help form a profile indicating whether the employer is who it claims to be (e.g., Google/Glassdoor/Yelp/Better Business Bureau/etc.).
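
As a loose illustration of the matching idea in the first bullet above, here is a minimal, hypothetical sketch of a “likelihood of fit” score between a job-seeker's skills and a posting's requirements. The skill names and weights are invented for illustration; a real tool would draw them from sources such as O*NET, COOL, or parsed resumes and job descriptions.

```python
# Hypothetical sketch only: scoring "likelihood of fit" between a job-seeker's
# skill profile and an employer's posting. Skill names and weights are invented
# for illustration, not drawn from any real data source.

def fit_score(seeker_skills: set[str], required_skills: dict[str, float]) -> float:
    """Return a 0-1 score: the weighted share of required skills the seeker has."""
    total_weight = sum(required_skills.values())
    if total_weight == 0:
        return 0.0
    matched = sum(w for skill, w in required_skills.items() if skill in seeker_skills)
    return matched / total_weight

seeker = {"logistics planning", "team leadership", "inventory management"}
posting = {"logistics planning": 0.5, "team leadership": 0.3, "forklift certification": 0.2}

print(f"Likelihood of fit: {fit_score(seeker, posting):.0%}")  # Likelihood of fit: 80%
```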

This competition is divided into phases that will enable Solvers to further engage users in their design process and gather feedback on working prototypes. The most important requirement of this competition is that the platform solves challenges faced by the actual users, understanding what’s most useful and helpful to them, rather than being tool-led.

Here is some of what we heard about the gap that exists:

Job-Seeker Needs

  • Ability to filter opportunities that match skillsets based on personally important components such as benefits offered, location, and culture.
  • Recommendations of opportunities outside occupational analysis, if necessary, to describe the skills and role in a familiar manner that Service members better understand.
  • Confidence in individual privacy and data protections at the highest standard.
  • Ability to “turn on / turn off” your visibility depending on whether you’re looking for opportunities.
  • Ability to control what information is shared with employers of potentially “matched” opportunities (e.g., if you match with an opportunity, what level of information is shown to an employer to either contact you or learn more about you).
  • Ability to take an action directly (such as applying for a job or contacting an employer via an API), minimizing the need to update multiple platforms or re-enter data multiple times.
  • Ability to connect with other veterans at a specific company.
  • Ability to easily remove a particular profile and all related data from the platform.

Employer Needs

  • Present themselves, in particular small businesses with limited HR capacities, through a profile that contains standardized information as well as information about the organization.
  • Post specific available opportunities in a manner that is designed to address service member and veteran-specific priorities, including: position descriptions, required skillsets, benefits, and work environment (e.g., support for Veterans, training, onboarding, mentorship opportunities). This could include options to connect a job-seeker directly to veterans currently employed at the company.
  • Generate “match lists” of qualified candidates.
  • Provide technology-enabled solutions to allow for direct outreach to candidates through mechanisms they regularly check (existing email accounts, text messages) rather than a separate message management system that requires additional login (only if candidates have set their profiles to allow for direct contact).

Federal Government Needs

  • Access to a feedback loop to learn more about matches made, types of users, most useful features, and other raw data to inform Federal program efforts and ultimately track outcomes, including employment that is initiated beyond 180 days, wage rates, and/or retention statistics (see the sketch after this list).
  • Ability to fold in/link to existing government data efforts when they reach maturity.
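
The sketch referenced above is a hypothetical example of the kind of anonymized outcome record a platform might report back to the Government to support this feedback loop. The field names (match_id, employed_past_180_days, hourly_wage, and so on) are illustrative assumptions, not a required schema.

```python
# Hypothetical sketch only: an anonymized outcome record a platform might share
# with the Government. Field names are illustrative assumptions, not a schema.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class MatchOutcome:
    match_id: str                       # anonymized identifier for the match
    user_type: str                      # e.g., "service_member", "veteran", "employer"
    hired: bool                         # did the match lead to employment?
    start_date: Optional[date] = None   # employment start date, if hired
    employed_past_180_days: Optional[bool] = None
    hourly_wage: Optional[float] = None
    still_retained: Optional[bool] = None

# Example record for a match that led to a hire
record = MatchOutcome(
    match_id="m-0001",
    user_type="veteran",
    hired=True,
    start_date=date(2020, 9, 1),
    employed_past_180_days=True,
    hourly_wage=28.50,
    still_retained=True,
)
```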

Prizes

The Department of Labor, with in-kind support through participation in testing and review from the Departments of Defense and Veterans Affairs, is offering a total prize pot of up to $1,000,000 awarded across four phases. The grand prize winner will be awarded up to $720,000.

  • Semi-Finalist Prizes (5): $20,000 each
  • Finalist Prizes (3): $100,000 each
  • Pilot Prize (1): $300,000
  • Grand Prize (1): $300,000

Phase I - Concept Paper: Top 10 scorers advancing to Phase II (no cash prize at this phase), in alphabetical order:

  • Acronis SCS
  • Eightfold.ai
  • Enfuego
  • JobPath Partners
  • LinkedIn
  • Pathfinder Labs
  • Purepost
  • Square Peg Hires
  • Vantage Point
  • Vet It

Phase II – Enhanced Wireframe: Top 5 scorers advancing to Phase III (cash prize of $20,000 each), in alphabetical order:

  • Eightfold.ai
  • LinkedIn
  • Square Peg Hires
  • Vantage Point
  • Vet It

Phase III – MVP (Semi-Finals): Top 3 scorers advancing to Phase IV (cash prize of $100,000 each), in alphabetical order:

  • Eightfold.ai
  • LinkedIn
  • Square Peg Hires

Phase IV – Field Testing (Finals): One top-scoring Finalist advancing to Phase V (cash prize of $300,000):

  • Eightfold.ai

Phase V – Implementation Pilot: Finalist must receive an overall approval rating score of 80% or greater (cash prize of $300,000):

  • Eightfold.ai

Rules

Eligibility to Participate

To be eligible to participate:

  • Each Competition Solver (individual, team, or legal entity) is required to register by submitting the Phase I concept paper via email, as described in these competition rules.
  • There shall be one Official Representative for each Competition Entry. The Official Representative must provide a name and email address and, by sending the first submission in Phase I, affirms on behalf of the Solver individual, team, or legal entity, that he or she has read and consents to be governed by the Competition Rules.
  • As determined by DOL, any violation of these rules will be grounds for disqualification from the Competition.
  • Multiple individuals and/or legal entities may collaborate as a Solver team to submit a single entry, in which case the designated Official Representative will be responsible for meeting all entry and evaluation requirements.
  • Participation is subject to all U.S. Federal, state, local, and country of residence laws and regulations.
  • Solvers must not be suspended, debarred, or otherwise excluded from doing business with the Federal Government. An individual or entity that is determined to be on the GSA Excluded Parties List (www.sam.gov) is ineligible to participate. Individuals entering on behalf of or representing a company, institution, or other legal entity are responsible for confirming that their entry does not violate any policies of that company, institution, or legal entity.
  • As a condition of participation, all Solvers must agree to indemnify the Federal Government against third-party claims for damages arising from or related to competition activities. Entrants are not required to obtain liability insurance or demonstrate financial responsibility in order to participate in the competition. By entering the contest, Solvers agree to hold DOL harmless from all legal and administrative claims to include associated expenses that may arise from any claims related to their entry or its use.
  • Federal grantees may not use Federal funds to develop competition solutions unless consistent with the purpose of their grant award.
  • Federal contractors may not use Federal funds from a contract to develop competition applications or to fund efforts in support of a competition entry.
  • Solvers may not be a Federal entity or Federal employee acting within the scope of their employment. Non-DOL, non-DOD, and non-VA Federal employees acting in their personal capacities should consult with their respective agency ethics officials to determine whether their participation in this Competition is permissible.
  • DOL, DOD, and VA Federal employees are not eligible to participate in this Competition.
  • Any other individuals or legal entities involved with the design, production, execution, distribution, or evaluation of this DOL Competition are not eligible to participate.

Eligibility to Win a Cash Prize

To be eligible for a cash prize:

  • A Solver (whether an individual, team, or legal entity), through one Official Representative, must have registered to participate and complied with all requirements under section 3719 of title 15, United States Code, and the competition rules.
  • At the time of Entry, the Official Representative (individual or team lead, in the case of a group project) must be age 18 or older.
  • An individual, whether participating singly or in a group, shall be a citizen or permanent resident of the United States.
  • In the case of a private entity that is participating as a Solver or as part of a Solver team, the business must be incorporated in and maintain a primary place of business in the United States or its territories.
  • Entrants in this Competition agree, as a condition for winning a cash prize, to complete and submit all requested winner verification and payment documents to DOL within three business days of formal notification. Failure to return all required verification documents by the date specified in the notification may be a basis for disqualification of a cash prize winning entry.
  • A Solver shall not be deemed ineligible because the Solver consulted with Federal employees or used Federal facilities in preparing its submission to this DOL Competition if the Federal employees and facilities are made available to all Solvers on an equitable basis.

Terms and Conditions

  • This Competition shall be performed in accordance with the America COMPETES Reauthorization Act of 2010, Pub. Law 111-358, Title I, § 105(a), Jan. 4, 2011, codified at 15 U.S.C. § 3719, as amended.
  • All contests are subject to all applicable federal laws and regulations. Participation constitutes full and unconditional agreement to these Official Rules and administrative decisions, which are final and binding in all matters related to the contest.
  • Eligibility for a prize award is contingent upon fulfilling all requirements set forth herein. This notice is not an obligation of funds; the final award of prizes is contingent upon the availability of appropriations and receiving suitable entries.
  • This DOL Competition is voluntary and open to all entities that meet the eligibility requirements. There may be only one submission (“Submission”) per eligible entity. Submissions at each stage must be received by the deadline indicated. Submissions received after the deadline will not be considered.
  • This Competition will be conducted in phases. Phases II and III include virtual or remote user testing evaluation. To maintain eligibility, Solvers selected to participate in these phases must participate in the user testing evaluation in order to be evaluated and advance to the next phase.
  • Phases IV and V of this Competition include in-person user testing events. To maintain eligibility, Solvers selected to participate in these phases must participate in these in-person user testing events in order to be evaluated and to advance to the next phase and to receive the grand prize. Solvers are required to cover their own expenses to these in-person user testing events.
  • A Competition entry constitutes an agreement to adhere to the competition rules and terms and conditions set forth by the contest sponsor, DOL.
  • Solvers must meet the eligibility requirements described in the Eligibility section, to participate and/or win a cash prize.
  • Any Solvers or entry found in violation of any rule will be disqualified at DOL’s sole discretion.
  • Each individual or Solver team certifies, through entry to the contest, that the entry is his/her own original, creative work and does not violate or infringe upon the creative work of others, as protected under applicable intellectual property (IP) law.
  • Each Solver certifies, through entry to the Competition, that any Submission by the Solver does not contain any harmful computer code (sometimes referred to as “malware,” “viruses,” or “worms”).
  • By entering the Competition, the Solvers agree to hold DOL harmless from all legal and administrative claims to include associated expenses that may arise from any claims related to their entry or its use.
  • All evaluation panel decisions are final and may not be appealed.
  • All cash prizes awarded by DOL to Solvers are subject to tax liabilities, and no withholding will be assessed by DOL on behalf of the Solver claiming a cash prize.
  • DOL reserves the right for any reason, including an insufficient number of qualified entries, to modify or cancel the Competition at any time during the duration of the competition.
  • All Solvers agree that they, their heirs, and their estates assume any and all risks and waive claims against the Federal Government and its related entities, except in the case of willful misconduct, for any injury, death, damage, or loss of property, revenue, or profits, whether direct, indirect, or consequential, arising from their participation in a prize competition, whether the injury, death, damage, or loss arises through negligence or otherwise.
  • By participating in the Competition, each Solver agrees to comply with and abide by all DOL Challenge rules, terms, and conditions, and the decisions of DOL and/or the individual judges, which shall be final and binding in all respects.

Ownership

Any applicable ownership of IP in the submission will remain with the Solver. By participating in the Competition the Solver is not transferring any exclusive IP rights in applicable patents, pending patent applications, or copyrights in connection with the submission. However, by entering the submission, the Solver agrees to grant the Federal government (“Government”) certain license rights, as set forth in this section.

Namely, the Solver grants the Government the right to review the submission, to publicly describe the submission in any materials created in connection with this competition, to screen and evaluate the submission, and to have the Judges, the Challenge administrators, and the designees of any of them review the submission. The Government is also granted the right to publicize Solver names and, as applicable, the names of Solver team members and/or organizations that participated in the submission following the conclusion of the competition.

In addition, a Solver that receives a cash prize from this Challenge agrees to grant the Government the following license rights in the Submission: (1) a non-exclusive, non-transferable, irrevocable, paid-up, royalty-free license to practice or have practiced for or on the Government’s behalf, throughout the world, any invention created by the Solver that covers the Submission; and (2) a non-exclusive, non-transferable, irrevocable, paid-up, royalty-free license to reproduce, distribute publicly, prepare derivative works, and publicly perform and display the Submission by or on behalf of the Government, worldwide in any format, whether electronically or in print.

By participating in the Challenge, each Solver (individual, team, or legal entity) represents and warrants that he or she is the sole author or owner of IP in connection with the Submission, or otherwise has the necessary rights to use the submission for purposes of the Challenge, including having any and all rights necessary to grant the license rights identified in this section. Each Solver further represents and warrants that the Submission does not infringe any copyright or any other rights of any third party of which the Solver is aware. To participate in the Challenge, each Solver warrants that there are no legal obstacles to granting the license rights of this section to the Government.

The winners of a cash prize for the Challenge (collectively, "Winners") may be featured on Federal websites, in newsletters, in social media, and in other print and electronic outreach materials.

Except where prohibited, participation in the contest constitutes the consent of each winner to the Government’s and its agents’ use of each winner’s name, likeness, photograph, voice, opinions, public summary, and/or hometown and state information for promotional purposes through any form of media, worldwide, without further permission, payment, or consideration.

Finally, the Government will continue to communicate about the resulting winner’s solution only if it continues to deliver mutual benefit to job seekers, employers, and the Government in achieving its mission to provide relevant services to these populations. In order to measure this and continue to improve programs and services, the Solver or Solver team must share platform-use data with the Government.

Judging Criteria

Advancement to Phase II

Evaluation Criteria:

  • Novelty (30%). Potential to revolutionize the job search process for transitioning service members/Veterans.
  • Needs Met / Potential for Impact (30%). Key evaluation factors shall include: superior matching capability which improves upon prior efforts, ease of use, and security and verification capabilities which improve upon prior efforts.
  • Technical Feasibility (30%). Integration of other efforts, data, and platforms. Key evaluation factors shall include the extent to which the proposal captures short- and long-term feasibility and sustainability.
  • Design quality (10%).

Advancement to Phase III

Evaluation Criteria:

  • Impact / Needs Met (25%). Key evaluation factors shall include: superior matching capability, ease of use, and security and verification capabilities, each improving upon prior efforts.
  • Technical Feasibility / Viability (25%). Integration of other efforts, data, and platforms. Key evaluation factors shall include the extent to which the proposal addresses short- and long-term feasibility and sustainability.
  • Usability (25%). Recognition of user needs; may incorporate a poll of actual testers.
  • Design Quality (25%). Completeness, clarity of workflows, and overall quality.

Advancement to Phase IV

Evaluation Criteria:

  • Functionality (40%). Key evaluation factors shall include a review of networking capabilities among user communities.
  • Analytical Depth / Integration (25%). Key evaluation factors shall include: security and verification, data integration and matching approaches, and the platform used.
  • Usability / User Experience (20%). May incorporate a poll of actual testers.
  • Deployment Approach and Sustainability (15%).

Advancement to Phase V

Evaluation Criteria:

  • Judges will evaluate these criteria, worth 50% of the overall score:
    • Functionality (20%). Key evaluation factors shall include a review of networking capabilities among user communities.
    • Quality of the Demonstration / Training (30%). Key evaluation factors shall include: data integration and matching approaches, and the platform used.
  • Users will evaluate these criteria, worth 50% of the overall score:
    • User Experience (25%). Ease of use, look and feel.
    • Net Promoter Score (10%). Scale of 1-10 ("How likely would you be to recommend this tool to a friend?").
    • Task Success (15%). Users are given instructions to complete a task; metrics are collected to measure the successful completion rate.
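
The weighted-sum arithmetic behind these percentages (and those of the earlier phases) is straightforward. The sketch below is an illustrative example only, assuming each criterion is scored on a 0-100 scale; the official scoring scale and aggregation method are not specified in these rules.

```python
# Illustrative only: the rules do not state the per-criterion scoring scale,
# so a 0-100 scale is assumed here.

# Judge-scored criteria (together worth 50% of the Phase V advancement score)
JUDGE_WEIGHTS = {"functionality": 0.20, "demonstration_training": 0.30}

# User-scored criteria (the remaining 50%)
USER_WEIGHTS = {"user_experience": 0.25, "net_promoter_score": 0.10, "task_success": 0.15}


def overall_score(judge_scores: dict, user_scores: dict) -> float:
    """Combine judge and user criterion scores (each 0-100) into one weighted total."""
    judge_part = sum(JUDGE_WEIGHTS[c] * judge_scores[c] for c in JUDGE_WEIGHTS)
    user_part = sum(USER_WEIGHTS[c] * user_scores[c] for c in USER_WEIGHTS)
    return judge_part + user_part  # the weights across both groups already sum to 1.0


# Hypothetical scores for one Solver
print(overall_score(
    {"functionality": 85, "demonstration_training": 90},
    {"user_experience": 80, "net_promoter_score": 70, "task_success": 95},
))  # -> 85.25
```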

Successful Completion of the Challenge

Evaluation Criteria:

  • User Experience (35%). Ease of use, look and feel.
  • Net Promoter Score (20%). Scale of 1-10 ("How likely would you be to recommend this tool to a friend?").
  • Task Success (35%). Users are given instructions to complete a task; metrics are collected to measure the successful completion rate.
  • Data Integration (10%).
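
Because the Net Promoter Score and Task Success figures come directly from tester responses, the final approval score can be viewed as a weighted roll-up of those measurements. The sketch below is hypothetical: it assumes a 0-100 scale for each criterion and a simple linear rescaling of the 1-10 recommendation question, neither of which is prescribed by these rules; the 80% approval threshold it checks against is the grand-prize condition described under Phase V.

```python
# Hypothetical illustration: the scales and normalization below are assumptions,
# not the official scoring method.

FINAL_WEIGHTS = {
    "user_experience": 0.35,
    "net_promoter_score": 0.20,
    "task_success": 0.35,
    "data_integration": 0.10,
}


def task_success_rate(outcomes):
    """Successful completion rate (0-100) over a list of True/False task outcomes."""
    return 100.0 * sum(outcomes) / len(outcomes)


def recommendation_score(ratings):
    """Average of 1-10 'recommend to a friend?' ratings, rescaled to 0-100 (assumed)."""
    avg = sum(ratings) / len(ratings)
    return (avg - 1) / 9 * 100.0


def approval_score(criterion_scores):
    """Weighted overall approval score (0-100)."""
    return sum(FINAL_WEIGHTS[c] * criterion_scores[c] for c in FINAL_WEIGHTS)


scores = {
    "user_experience": 88.0,
    "net_promoter_score": recommendation_score([9, 8, 10, 7, 9]),        # ~84.4
    "task_success": task_success_rate([True, True, False, True, True]),  # 80.0
    "data_integration": 75.0,
}
total = approval_score(scores)
print(f"{total:.1f} -> {'meets' if total >= 80.0 else 'below'} the 80% threshold")
```

Under these assumptions, the hypothetical inputs above roll up to roughly 83, which would clear the 80% bar; the actual calculation used by the evaluators may differ.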

Written feedback may be provided to all participants at the end of each phase. The purpose of this feedback is to offer insight into the Government’s application of the scoring criteria or into needs identified by users. However, the Government will not respond to questions or inquiries regarding this feedback.

How To Enter

Phase I: Proposal

Open Date: December 11, 2019

Duration: 6 weeks

Close Date: January 24, 2020 – Deadline to submit at 5:00 PM EST

Submit: Concept paper

  • Entries must consist of PDF files with font size no smaller than 11-point Arial.
  • All submissions must be in English.
  • Solvers must not use DOL, VA, DOD, or other government logos or official seals in the submissions and must not otherwise give an appearance of Federal government endorsement.
  • Submission details to [email protected]:
    • Title the email subject line “Veterans Employment Challenge Phase I Proposal”.
    • The email sender (person and email address) must be the Official Representative (point of contact) for the team.
    • The following documents should be attached:
      • Proposal cover sheet as PDF
        • Solver Official Representative (point of contact) contact information (full name, email, phone)
        • Name of organization(s) and all team member(s) submitting entry
      • Concept description document (two pages maximum) as PDF
      • Wireframe sketch: The wireframe sketch should be included either as an attachment or as a link in the body of the email to a clickable website. A one-page description of the wireframe sketch should be included with the Submission.

Description: The concept paper phase invites all eligible entities to submit a concept white paper outlining their technology, the potential impact for job seekers and employers, the innovative idea/concept behind how their platform could work and leverage existing efforts, and how this is informed by the user needs they have identified. Each concept and wireframe will be reviewed by a tri-agency panel. Only the first two pages (inclusive of any visual representations or graphics) of the concept description document will be reviewed, along with the wireframe sketch submission and its one-page description, if any. During the Phase I review process, the concept paper and wireframe sketch will be redacted to remove both applicant names and any specific organizational partner names referenced (as opposed to broader categories of organizational partners) and will be reviewed by Federal evaluators screened to avoid conflicts of interest. The Department of Labor (DOL) will provide additional criteria throughout the various phases to further define requirement expectations.

Prize: Up to 10 applicants with the highest-ranking white papers will advance to Phase II.

Notification date of advancement of Semi-Finalists to Phase II: January 31, 2020 by 5:00 PM EST


Phase II: Enhanced Wireframe: User-Centered Testing/Concept Optimization

Open Date: February 3, 2020

Duration: 4 Weeks

Close Date: February 28, 2020 – Deadline to submit at 5:00 PM EST

Submit: Wireframe (document or link)

  • All submissions must be in English.
  • Solvers must not use DOL, VA, DOD, or other government logos or official seals in the submissions and must not otherwise give an appearance of Federal government endorsement.
  • Submission details to [email protected]:
    • The sender must be the same Official Representative who submitted the entry for the Solver team in Phase I.
    • Title the email subject line “Veterans Employment Challenge Phase II Wireframe”.
    • The wireframe must be included as either an attachment or a link in the body of an email to a clickable website.
    • The wireframe can be accompanied by a document (no more than 5 pages) summarizing how user feedback from the rapid testing sessions was incorporated into the overall design; this summary may instead be built into the wireframe itself.

Description: The applicants selected from Phase I (up to 10 Solvers) will be invited to participate in a “rapid testing” session with actual users (employers, Service members, and Veterans) during the week of February 21. Solvers will have an opportunity to present their wireframe and receive feedback from these three user groups, in order to update a final wireframe submission at the end of this phase. Solvers will also be provided with sample data sets. Solvers selected to participate in this phase must participate in the rapid testing in order to be evaluated and advance to the next phase.

Prize: Up to 5 awardees will advance to the next phase and receive $20,000 each.

Notification date of advancement of Semi-Finalists to Phase III: March 6, 2020 by 5:00 PM EST


Phase III: MVP (Semi-Finals)

Open Date: March 9, 2020

Duration: 8 Weeks

Close Date: May 1, 2020 – Deadline to submit at 5:00 PM EST

Submit: MVP (link)

  • Submission details to [email protected]:
    • The sender should be the same Official Representative who submitted the entry for the Solver team in Phases I and II.
    • The MVP must be a workable link.
    • The MVP can be accompanied by a document (no more than 5 pages) summarizing how user feedback from the rapid testing sessions was incorporated into the overall design; this summary may instead be built into the MVP itself.
    • The Solver should also include a one-page overview of the business/pricing model of the product.

Description: The Phase II awardees (up to 5 Solvers) will be invited to test their MVPs with actual users (employers, Service members, and Veterans) in a virtual user testing session during the week of March 2 (exact dates TBD). Solvers should bring a clickable MVP to this event to receive feedback from these three user groups, in order to update a final MVP submission at the end of this phase. Solvers selected to participate in this phase must participate in the user testing in order to be evaluated and advance to the next phase.

Prize: Up to 3 awardees will advance to the next phase and receive $100,000 each.

Notification date of advancement of Finalists to Phase IV: May 8, 2020 by 5:00 PM EST


Phase IV: Field Testing (Finals)

Open Date: May 11, 2020

Duration: 8 Weeks

Close Date: July 3, 2020 – Deadline to submit at 5:00 PM EST

Submit: Alpha Product Version, 30-minute TAP module

  • Submission details to [email protected]:
    • The sender should be the same Official Representative who submitted the entry for the Solver team in Phases I-III.
    • Title the email subject line “Veterans Employment Challenge Phase IV Pilot Materials”.
    • The product should be fully functional for job seekers and employers.

Description: The Phase III awardees (up to 3 Solvers) will be invited to participate in a small-scale pilot to test their product with Service members participating in TAP sessions during the weeks of June 15 and June 22 (exact dates TBD). Solvers will have 30 minutes in these sessions to present their tool, give a demo, and provide Service members the opportunity to test the tool while their reactions and use are observed. Following these demos and delivery, Solvers will have one week to update their final product before submission. Solvers selected to participate in this phase must participate in the user testing in order to be evaluated and advance to the next phase. Solvers must cover their own expenses for the user testing event.

Prize: 1 Finalist will advance to the next phase and receive a prize of $300,000.

Notification date of advancement of selected Finalist to Phase V: July 10, 2020 by 5:00 PM EST


Phase V: Implementation Pilot

Open Date: July 13, 2020

Duration: 8 Weeks

Close Date: September 4, 2020 – Deadline to submit at 5:00 PM EST

Submit: Beta Product Version, 30-minute TAP module

  • Submission details to [email protected]:
    • The sender should be the same Official Representative who submitted the entry for the Solver team in Phases I-IV.
    • Title the email subject line “Veterans Employment Challenge Phase V Final Materials”.
    • The product must be fully functional for job seekers and employers, with incorporated improvements from additional piloting.

Description: The selected Phase IV Finalist will be invited to participate in a series of implementation pilots to further test and refine their product with Service members participating in TAP sessions during the weeks of July 27 – August 21 (exact dates TBD); this may include a test with a large-scale pilot employer (e.g., a Federal government agency). The Finalist will have 30 minutes in these sessions to present their tool, give a demo, and provide Service members the opportunity to test the tool while their reactions and use are observed. Following these demos and delivery, the Finalist will have two weeks to update their final product before submission. The overall goal of this final round is for the Finalist to develop a tool that transitioning Service members will use. In order to claim the $300,000 grand prize, the tool must receive an overall approval rating score of 80% or greater (calculated from the evaluation criteria). The raters for this round will be Service members, instructors, and employers participating in the training sessions. The selected Finalist must cover their own expenses for the user testing events and must participate in them in order to be eligible for the grand prize.

Prize: Grand Prize winner may receive a prize of $300,000.

Notification date of Grand Prize Winner: September 18, 2020 by 5:00 PM EST