Jacobs Technology, Inc. | US GAO – Government Accountability Office

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.
Decision
Matter of: Jacobs Technology, Inc.
File: B-420016; B-420016.2
Date: October 28, 2021
Brian P. Waagner, Esq., Steven A. Neeley, Esq., Michael J. Schrier, Esq., Maya Desai, Esq., and Leah C. Kaiser, Esq., Husch Blackwell LLP, for the protester.
Douglas P. Hibshman, Esq., Reginald M. Jones, Esq., and Michael A. Hordell, Esq., Fox Rothschild LLP, for Sierra Lobo, Inc., the intervenor.
Cody Corley, Esq., Ian F. Rothfuss, Esq., and Jaewon Choi, Esq., National Aeronautics and Space Administration, for the agency.
Michael P. Grogan, Esq., and Edward Goldstein, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.
DIGEST
Protest challenging the agency’s technical and past performance evaluations is denied where the record shows that the evaluations were reasonable and consistent with the terms of the solicitation, and any errors were not materially prejudicial.
DECISION
Jacobs Technology, Inc. (Jacobs), of Tullahoma, Tennessee, protests the award of a contract to Sierra Lobo, Inc. (SLI), under request for proposals (RFP) No. 80JSC020R0026, issued by the National Aeronautics and Space Administration (NASA), for testing and operations support at the Johnson Space Center’s White Sands Test Facility in New Mexico. The protester contends that the agency’s evaluation of the offerors’ past performance and technical proposals was flawed.
We deny the protest.
BACKGROUND
The agency issued the solicitation on August 17, 2020, pursuant to the procedures in Federal Acquisition Regulation (FAR) part 15, seeking continued test and operations support at the Johnson Space Center’s White Sands Test Facility and other identified locations. Agency Report (AR), Tab 2.05, RFP amend. 4, Performance Work Statement (PWS) at 2026-2027.[1] Specifically, this requirement, designated as the Test, Evaluation, and Support Team 3 (TEST3) contract, seeks contractor support for: propulsion testing; propellants, aerospace fluids, materials, and components testing; hypervelocity impact testing; flight hardware processing; technical services; training; facility maintenance and operation; and construction management. AR, Tab 2.01, RFP at 0015. The RFP anticipated the award of a single indefinite-delivery, indefinite-quantity contract–with task orders issued on a hybrid cost-plus-award-fee and fixed-price basis–with a 5-year period of performance. AR, Tab 2.06, RFP amend. 5 at 2329.
The solicitation provided for award on a best-value tradeoff basis, considering three factors: (1) mission suitability; (2) past performance; and (3) cost/price. Id. at 2498-2499. Mission suitability had four subfactors: (A) management approach (MA); (B) technical approach (TA); (C) innovations & efficiencies approach (IE); and (D) small business utilization (SBU). Id. at 2499. These mission suitability subfactors, in turn, each included sub-elements.[2] The MA subfactor included the following sub-elements: (i) contract management approach (MA1); (ii) safety and health approach (MA2); (iii) total compensation plan (MA3); (iv) staffing and critical skills plan (MA4); and (v) phase-in plan (MA5). Id. at 2500. The TA subfactor included sub-elements corresponding to an offeror’s technical understanding and resources for specific task orders: (i) base work task order (TA1); (ii) human landing system commercial lunar descent stage – vacuum hot fire test task order (TA2); and (iii) Mars ascent vehicle flow control valve test support task order (TA3).[3] Id. Mission suitability, as a whole, was evaluated on a 1,000-point scale, based on the relative importance of the underlying subfactors: the MA subfactor was worth 425 points; the TA subfactor was worth 350 points; the IE subfactor was worth 125 points; and the SBU subfactor was worth 100 points. Id. at 2499.
Past performance would be evaluated for recency, relevance, and performance. Id. at 2502. Recent performance included work completed within five years of the solicitation, and NASA would use the following adjectival rating scheme to assess relevancy: very relevant; relevant; somewhat relevant; and not relevant. Id. at 2503. When assessing performance, the agency would apply one of the following ratings: excellent; very good; good; satisfactory; poor/unsatisfactory; and not applicable. Id. at 2504. NASA would then assign an overall past performance confidence rating–considering recency, relevance, and performance, as a whole–with the following possible rating combinations: very high; high; moderate; low; very low; and neutral.[4] Id. at 2504-2505. Concerning cost/price, NASA would evaluate whether offerors’ costs/prices were reasonable and realistic. Id. at 2506-2507. The solicitation advised that mission suitability and past performance, when combined, were more important than cost/price, mission suitability was approximately equal to past performance, and mission suitability was more important than cost/price. Id. at 2499.
Following the submission of proposals, NASA established a competitive range that included only Jacobs and SLI. AR, Tab 3.02, Competitive Range Determination. Following discussions with both offerors, NASA received final proposal revisions from Jacobs and SLI on April 30. See generally AR, Tab 5, Final Proposal Revisions. The following is a summary of the agency’s final ratings of SLI and Jacobs:[5]
                                            SLI                Jacobs
Mission Suitability Score (out of 1000)     847                794
  Management Approach (MA)                  Very Good (345)    Good (298)
  Technical Approach (TA)                   Excellent (333)    Very Good (308)
  Innovations & Efficiencies (IE)           Good (69)          Good (88)
  Small Business Utilization (SBU)          Excellent (100)    Excellent (100)
Past Performance Confidence                 Very High          Very High
Most Probable Cost/Price[6]                 $246.9M            $251.8M
AR, Tab 7.02, Source Selection Statement (SSS) at 5496-5497.
Following its evaluation, NASA selected SLI as representing the best value to the agency. SLI’s proposal received a score of 847 points under the mission suitability factor, a rating of very high under the past performance confidence factor, and was determined to have a probable cost of $246.9 million. Id. at 5496. In contrast, Jacobs’s proposal received a score of 794 points under the mission suitability factor, a rating of very high under the past performance confidence factor, and was determined to have a probable cost of $251.8 million. Id. The source selection authority (SSA) stated that while past performance was not a discriminator between offerors, SLI’s proposal was more advantageous under the mission suitability factor and offered a lower price, and therefore represented a better value to the agency. Id. at 5510. The agency made award to SLI on July 12. Following its debriefing on July 14, Jacobs filed the instant protest on July 26.
DISCUSSION
Jacobs raises several challenges to the agency’s selection of SLI for award. First, the protester alleges that the agency’s past performance evaluation was flawed because NASA: (1) improperly credited SLI for experience in a manner that did not comport with the solicitation’s guidelines; and (2) unreasonably equated offerors’ prior experience. Protest at 13-18; Comments and Supp. Protest at 2-14; Supp. Comments at 4-9. Jacobs also challenges NASA’s evaluation under the mission suitability factor, alleging unequal treatment, the application of unstated evaluation criteria, and a failure to follow the evaluation requirements outlined in the RFP. Protest at 18-25; Comments and Supp. Protest at 14-27; Supp. Comments at 9-16. For the following reasons, we find no basis on which to sustain the protest.[7]
Past Performance
Jacobs contends that the agency, contrary to the terms of the solicitation, gave SLI credit for two contracts performed by a joint venture of which SLI was a member. Comments and Supp. Protest at 3-6; Supp. Comments at 4-7. In addition, the protester argues that the agency improperly equated Jacobs's and SLI's past performance by overstating the value of SLI's references and understating the relevancy of several of Jacobs's references. Comments and Supp. Protest at 6-14; Supp. Comments at 7-9. The agency contends its evaluation was reasonable and consistent with the solicitation's evaluation criteria.
When a protester challenges an agency’s evaluation of past performance, we will review the evaluation to determine if it was reasonable and consistent with the solicitation’s evaluation criteria and procurement statutes and regulations, and to ensure that it is adequately documented. Falcon Envtl. Servs., Inc., B-402670, B-402670.2, July 6, 2010, 2010 CPD ¶ 160 at 7. An agency’s evaluation of past performance, which includes its consideration of the relevance, scope, and significance of an offeror’s performance history, is a matter of discretion, which we will not disturb unless the agency’s assessment is unreasonable or inconsistent with the solicitation criteria. Metropolitan Life Ins. Co., B-412717, B‑412717.2, May 13, 2016, 2016 CPD ¶ 132 at 14. A protester’s disagreement with the agency’s past performance judgments, without more, is insufficient to establish that the evaluation was improper. Beretta USA Corp., B-406376.2, B-406376.3, July 12, 2013, 2013 CPD ¶ 186 at 10.
Here, the solicitation instructed offerors to identify up to seven contracts for the past performance evaluation. AR, Tab 2.06, RFP amend. 5 at 2474, 2503-2504. SLI identified six references, two of which were for work SLI performed as part of joint ventures–the Test Facilities Operation, Maintenance, and Engineering II (TFOME II) contract and Test Operations and Sustainment (TOS) contract. AR, Tab 5.01, SLI Final Proposal Revision at 4256, 4266. NASA deemed both contracts recent, assigned each a rating of relevant, and found that the contracts warranted a performance rating of excellent. AR, Tab 6.01, SLI Final Evaluation at 5304.
Jacobs argues that the agency unreasonably credited SLI for these contracts “without giving any consideration to what portion of SLI’s actual work contributed” to the excellent performance rating. Supp. Comments at 5. The protester argues that the agency’s evaluation runs contrary to the terms of the solicitation, which required NASA to examine the extent to which each offeror or team member performed the underlying work of a submitted reference.[8] Id. at 6. The agency counters that the solicitation explicitly instructed that offerors with a history of past performance as part of a joint venture would be evaluated based on all of the work performed by the joint venture. In support of its position, the agency relies on the section of the RFP stating that “Offerors’ with past performance as part of a joint venture will be evaluated on the entirety of the work performed by the joint venture.” AR, Tab 2.06, RFP amend. 5 at 2503. To the extent Jacobs disagreed with that approach, NASA argues, it was required to raise objections to the RFP’s language prior to the due date for proposals. Memorandum of Law at 10-12. In response, the protester contends the language cited by NASA was limited to the agency’s evaluation of relevancy–it did not pertain to an evaluation of the quality of the performance–and that the proposal submission instructions outlined in section L.23 of the RFP required NASA to consider what portions of a past performance reference were actually performed by an offeror. Comments and Supp. Protest at 3-6.
Assuming, for the sake of argument, we agree with the protester that the RFP’s language–stating that past performance as part of a joint venture would be evaluated based on the entirety of the work performed–was limited to the evaluation of relevancy (and did not expressly pertain to the evaluation of performance quality), we find nothing in the RFP to support the basic premise of Jacobs’s argument. This premise, in essence, is that the solicitation required NASA to ascertain what work SLI performed within a submitted joint venture reference when evaluating the performance quality of the reference. Indeed, the performance evaluation criteria provide only that “[t]he past performance of joint ventures will be evaluated[,]” which, by itself, does not support the protester’s view. AR, Tab 2.06, RFP amend. 5 at 2504.
Moreover, the RFP language Jacobs cites as requiring a proportional evaluation of past performance does not in fact support its position. The proposal submission instructions referenced by the protester concern a requirement for offerors to specify the portions of the PWS the offeror’s teaming and joint venture partners would be responsible for in performance of the TEST3 requirement. This provision does not, however, concern how NASA would evaluate historical performance. AR, Tab 2.06, RFP amend. 5 at 2475. Thus, in our view, the protester fails to persuasively explain why it would be appropriate for the agency to consider the entirety of SLI’s joint venture past performance references when considering relevancy, but not consider the entirety of the reference when evaluating the quality of the performance for the same reference. In the absence of clear direction in the solicitation to the contrary, we conclude that the agency’s decision to evaluate SLI’s performance based on the entirety of work it performed as part of a joint venture was not unreasonable or inconsistent with the terms of the solicitation.
Next, Jacobs argues that NASA improperly equated Jacobs’s past performance with SLI’s, leading NASA to unreasonably conclude that both offerors warranted a confidence rating of “very high.” In the protester’s view, SLI’s references were less relevant and smaller than Jacobs’s; therefore SLI should have received a lower rating. In addition, Jacobs argues that NASA undervalued the relevancy of some of its past performance references. Protest at 16-18; Comments and Supp. Protest at 6-14; Supp. Comments at 7-9. Our review of the record, however, confirms the agency’s past performance evaluation was reasonable.
Regarding NASA’s evaluation of the relevance of SLI’s past performance, Jacobs makes two principal arguments. First, Jacobs argues that SLI’s past performance references were smaller in terms of dollar value, and less relevant in terms of content and complexity, as compared to its own references. In this regard, the protester avers that all but one of SLI’s submitted past performance references fall below the $80 million average annual contract value for the TEST3 effort. Comments and Supp. Protest at 7-8. Moreover, from a content and complexity vantage point, Jacobs argues that SLI’s references were not sufficiently similar to the TEST3 PWS to warrant a rating of very relevant. Id. at 7-8. The agency argues the protester views relevancy with too narrow a lens. The solicitation did not provide that NASA would exclusively compare the dollar values of submitted past performance references to the instant requirement; instead, the RFP advised that the agency would evaluate relevancy based on size, content and complexity of submitted contracts. AR, Tab 2.06, RFP amend. 5 at 2503. Though how NASA would evaluate size was not specifically explained in the RFP, the underlying record demonstrates that the agency considered contract value and the contract’s work year equivalents (WYEs).[9] AR, Tab 7.01, Source Selection Decision Presentation at 5462; Supp. Contracting Officer’s Statement (Supp. COS) at 4-5.
We view the agency’s use of these metrics, and its evaluation conclusions, as reasonable. The record demonstrates that the agency thoughtfully considered both the contract value and number of WYEs when assessing relevancy, and reasonably concluded that SLI’s submitted efforts were either much the same size or essentially the same size as the TEST3 effort.[10] AR, Tab 6.01, SLI Final Evaluation at 5307-5314 (showing that the SEB examined and compared the dollar value and WYEs for each reference to the TEST3 contract before making a determination as to size similarity). For example, concerning SLI’s performance on the Test Facilities Operation, Maintenance, and Engineering (TFOME) contract, NASA noted that the contract’s $31 million annual value was “a smaller contract value than that which is anticipated on TEST3[.]” Id. at 5308. However, the source evaluation board (SEB) found that because SLI “manages 350 WYEs demonstrating SLI’s familiarity in working with a workforce essentially the same size as that which is anticipated on TEST3[,]” when considering “both size elements of the TFOME contract [and] the TEST3[,] . . . [the] SEB determined this contract is much of the size as TEST3.” Id.; see also id. at 5308-5309 (finding SLI’s Engineering Test and Integration Services II contract to be “much the same size” as the TEST3 effort based on an annual contract value of $50 million and 250 WYEs). While the protester may disagree with the agency’s evaluation, given the agency’s broad discretion to evaluate the relevancy of an offeror’s past performance reference, we find no basis to conclude the agency’s conclusions were unreasonable. See Silverback7, Inc., B-408053.2, B-408053.3, Aug. 26, 2013, 2013 CPD ¶ 216 at 8-9.
Jacobs’s second challenge to the agency’s evaluation of the relevance of SLI’s past performance centers on the agency’s finding that SLI’s past performance, overall, warranted a rating of very relevant, despite the agency having rated SLI’s individual references as merely relevant. Comments and Supp. Protest at 8-10; Supp. Comments at 7-8. In effect, the protester argues that the sum of SLI’s past performance cannot be greater than its parts. Comments and Supp. Protest at 10. The record does not support the protester’s conclusion.
The SEB examined each of the offerors’ past performance references for relevancy by comparing the historical work performed to the TEST3 PWS subsections in terms of content, complexity, and size. Supp. COS at 5-6. The SEB, through this evaluation, assessed the degree to which each PWS task area was represented by an offeror’s prior performance history, and then aggregated this information across all the PWS task areas to arrive at an overall assessment for each offeror. Id. at 6-7. Thus, while SLI’s past performance references were, individually, only rated as relevant, the SEB concluded that the references, taken together, demonstrated effort of essentially the same content, complexity, and size as the TEST3 effort. Id.; AR, Tab 6.01, SLI Final Evaluation at 5314. We find the agency’s conclusions concerning the relevancy of SLI’s past performance unobjectionable.[11]
Jacobs next turns to NASA’s evaluation of the relevance of its own past performance, arguing that the agency understated the relevance of two of its contracts–the Johnson Space Center Engineering, Technology, and Science (JETS) contract, and Air Force Operations Maintenance Information Management Support (OMIMS) contract. Protest at 16-18; Comments and Supp. Protest at 10-12; Supp. Comments at 8-9. NASA assigned Jacobs’s JETS contract, for the provision of engineering and scientific products and technical services at the Johnson Space Center, a rating of somewhat relevant. The agency assigned this rating because it found that “Jacobs’ effort on JETS involved some of the content and essentially the same complexity and size as TEST3” PWS. AR, Tab 6.02, Jacobs Final Evaluation at 5372. More specifically, the SEB found that the “JETS contract supports Jacobs’ performance in 19 of the 44 TEST3 PWS sub-sections.” Id. at 5371. Additionally, of the seven core PWS sub-sections identified in the TEST3 PWS, the JETS contract provided evidence that Jacobs had experience with “much of the content” in only one core sub-section, and “some of the content” in two other core sub-sections.[12] Id. The SEB also found that there was “little to no evidence” demonstrating that the JETS contract included work in 25 of the PWS sub-sections. Id. at 5372.
Concerning the OMIMS contract, the agency assigned the reference a rating of relevant, based on a finding that “Jacobs’ effort on OMIMS involved much of the content and essentially the same complexity and size as TEST3” requirement. Id. at 5374. The SEB found this contract reference to support Jacobs’s performance in 39 of the 44 TEST3 PWS sub-sections. For the seven core PWS sub-sections, the agency found that the OMIMS contract evidenced the protester’s experience with “essentially the same content” in one core area, “much of the content” in another core sub-section, and “some of the content” in two other areas. Id. The SEB, however, found that the OMIMS contract demonstrated little to no evidence of the protester’s experience with five other PWS sub-sections. Id.
While the protester disagrees with the agency’s relevancy assessment for both past performance references, we conclude that the record demonstrates the reasonableness of NASA’s evaluation. By way of example, Jacobs disputes the agency’s finding that the JETS contract only reflected “some of the content” in two core sub-sections. According to the protester, its proposal demonstrated its performance on JETS matched all of the TEST3 core PWS sub-sections, to include sub-section 3.7, concerning flight acceptance standard testing. See PWS at 2056-2057. The protester asserts that it highlighted how its performance on the JETS contract overlapped with sub-section 3.7, to include providing “exceptional support for flight and special test equipment and development and processing . . . including engineering drawings, analysis reports, technical specifications and reports, as well as procedures and operations manuals as appropriate for [hardware] or [software].” AR Tab 5.02, Jacobs Final Proposal Revision at 4771. However, the SEB reasonably concluded that this reference was more accurately encompassed by PWS sub-section 3.6, concerning spaceflight component services. Supp. COS at 10. The protester’s disagreement with NASA’s past performance evaluation conclusions, without more, is insufficient to establish that the evaluation was improper. Centerra Group, LLC, B‑414800, B-414800.2, Sept. 21, 2017, 2017 CPD ¶ 307 at 8-10. Accordingly, we conclude the agency’s past performance evaluation was reasonable.[13]
Mission Suitability
Jacobs also challenges the agency’s evaluation under the lone technical factor, mission suitability, arguing NASA treated offerors unequally in the assignment of strengths and significant strengths, applied unstated evaluation criteria, and deviated from the evaluation requirements. Protest at 18-25; Comments and Supp. Protest at 14-26; Supp. Comments at 9-16. The evaluation of proposals is primarily a matter within the agency’s discretion, since the agency is responsible for defining its needs and identifying the best method for accommodating them. International Preparedness Assocs. Inc., B-415416.3, Dec. 27, 2017, 2017 CPD ¶ 391 at 4. In reviewing a protest challenging an agency’s evaluation, our Office will not reevaluate proposals; rather, we will examine the record to determine whether the agency’s evaluation conclusions were reasonable and consistent with the terms of the solicitation and applicable procurement laws and regulations. OPTIMUS Corp., B-400777, Jan. 26, 2009, 2009 CPD ¶ 33 at 4. A protester’s disagreement with the agency’s judgment, by itself, is not sufficient to establish that an agency acted unreasonably. Hughes Network Sys., LLC, B-409666.5, B-409666.6, Jan. 15, 2015, 2015 CPD ¶ 42 at 6.
Regarding the unequal assignment of strengths and significant strengths, Jacobs argues that NASA credited several features of SLI’s proposal with a number of significant strengths and strengths under various mission suitability subfactors, yet the agency failed to give the protester similar credit, even though its proposal offered the same features. It is a fundamental principle of federal procurement law that a contracting agency must treat all vendors equally and evaluate their proposals evenhandedly against the solicitation’s requirements and evaluation criteria. Rockwell Elec. Commerce Corp., B-286201 et al., Dec. 14, 2000, 2001 CPD ¶ 65 at 5. However, when a protester alleges unequal treatment in a technical evaluation, it must show that the differences in the evaluation did not stem from differences between the proposals. IndraSoft, Inc., B-414026, B-414026.2, Jan. 23, 2017, 2017 CPD ¶ 30 at 10; Paragon Sys., Inc.; SecTek, Inc., B-409066.2, B-409066.3, June 4, 2014, 2014 CPD ¶ 169 at 8-9.
Here, the record demonstrates that the differences in the evaluation findings reasonably derive from differences in the offerors’ proposed technical approaches, most notably the relative lack of detail in Jacobs’s proposal.[14] As one representative example, Jacobs alleges that NASA treated offerors unequally under the third sub-element under the technical approach subfactor, TA3, concerning the Mars ascent vehicle flow control valve test support task order. Comments and Supp. Protest at 5279-5280. SLI received a significant strength for its technical solution to this task order (for its project planning, test system design, and test system operations), while Jacobs’s approach to TA3 warranted only a strength (for the firm’s approach to [DELETED]). Compare AR, Tab 6.01, SLI Final Evaluation at 5279-5280 with AR, Tab 6.02, Jacobs Final Evaluation at 5346.
In documenting her selection decision discussing Jacobs’s proposal, the SSA notes she “agreed with the SEB that the lack of detail limited the finding to a Strength.” AR, Tab 7.02, SSS at 5504. The contracting officer, who was also a member of the SEB, elaborates on the lack of detail in Jacobs’s proposal, citing three specific examples. COS at 31-33. For instance, the contracting officer asserts that Jacobs stated that it would exceed the PWS requirements concerning project status reports, but its proposal did not provide any narrative detail or rationale to support this assertion. Id. at 32; Supp. COS at 14-15. While Jacobs may believe that it provided sufficiently detailed information to warrant a significant strength like SLI, given the discretion afforded to the agency in assessing the merits of a proposal, we cannot conclude that the agency acted unreasonably or treated offerors in a disparate manner. The protest allegation is denied.[15]
Next, the protester contends that the agency relaxed the solicitation’s evaluation criteria when it evaluated SLI’s proposal. Protest at 23; Comments and Supp. Protest at 23-24. SLI received a weakness under sub-element MA2, concerning its safety and health approach.[16] The SSA, who wanted more information concerning this weakness, sought input from an ex officio member of the SEB, the safety & mission assurance representative, about the potential impact to the TEST3 program. AR, Tab 7.02, SSS at 5501. This representative found that the issues identified in SLI’s proposal “are commonplace for new, onsite contractors and usually pose little risk to the Government since compliance with all requirements would be mandatory prior to initiating any onsite work.” Id. Though the SSA “felt comfortable that the safety and health of NASA personnel and missions would be protected,” she believed SLI’s approach still warranted a weakness. Id. Jacobs argues that “[b]y taking the safety representative’s input into consideration and relaxing the requirement to minimize the evaluators’ concerns with SLI’s safety plan, NASA’s evaluation departed from the RFP and relied on unstated evaluation criteria that improperly benefitted SLI.” Comments and Supp. Protest at 23.
We have no basis to question the agency’s evaluation. A source selection official may make an independent evaluation judgment, provided the basis for that judgment is reasonable and documented in the contemporaneous record. Thoma-Sea Marine Constructors, LLC, B-416240, B-416240.2, July 16, 2018, 2018 CPD ¶ 245 at 10. Here, the agency did not relax the evaluation criteria. Instead, the SSA used available resources to make an informed decision concerning the potential impact of a weakness assigned to SLI’s proposal; this is exactly the type of thoughtful consideration procurement officials are charged with, and reflects the essential exercise of the SSA’s discretion to weigh the potential impact of proposal risks on agency operations. We find nothing objectionable with the agency’s conclusions.
Finally, the protester challenges the agency’s assignment of a weakness for Jacobs’s total compensation plan under the management approach subfactor, MA3. The agency assigned the weakness because it found that Jacobs had proposed to pay labor rates below those required by the Service Contract Act (SCA). Protest at 24-25; Comments and Supp. Protest at 24-26; Supp. Comments at 15. The protester contends that this weakness (and a $40,000 upward adjustment to its cost/price proposal) was unwarranted because Jacobs’s proposal complied with the SCA wage determination attached to the RFP. Id. However, we need not reach the merits of this issue because Jacobs cannot demonstrate it was prejudiced by the agency’s actions.
Competitive prejudice is an essential element of any viable protest; where the protester fails to demonstrate that, but for the agency’s actions, it would have had a substantial chance of receiving the award, there is no basis for finding prejudice, and our Office will not sustain the protest, even if deficiencies in the procurement are found. DynCorp Int’l LLC, supra, at 10 n.13. Here, even without this weakness, Jacobs’s rating under the MA subfactor could not have improved from good to very good. Without the weakness, Jacobs would have four strengths under this subfactor, but because it was not assigned any significant strengths, it was ineligible for a rating of very good. See AR, Tab 1, TEST3 Evaluation Plan at 6 (noting that a rating of very good requires a proposal where “[o]ne or more significant strengths have been found, and strengths outbalance any weaknesses that exist.”). Moreover, Jacobs’s numerical score of 298 under the MA subfactor was already the highest numerical score available for the adjectival rating of good. Id. Additionally, the SSA found this weakness to be “relatively minor[.]” AR, Tab 7.02, SSS at 5501. Because SLI’s proposal would remain higher rated and lower priced, we cannot conclude that Jacobs’s competitive position would be altered with the removal of this weakness and a downward cost adjustment of $40,000 to its proposed cost/price of $251.7 million.
The protest is denied.
Edda Emmanuelli Perez
General Counsel
[1] Citations to the agency report are to the uniform bates numbering applied by the agency.
[2] The RFP advised that the sub-elements “are listed in no particular order of importance and not assigned any predetermined weight or points.” AR, Tab 2.06, RFP amend. 5 at 2499-2500.
[3] Though not immediately relevant to the protest allegations, the IE and SBU subfactors each had two identified evaluation sub-elements. See id. at 2500-2501.
[4] Each individual confidence rating had both a performance and relevance component. AR, Tab 2.06, RFP amend. 5 at 2504.
[5] Evaluation point totals for the subfactors are indicated in parentheses.
[6] As provided in the solicitation, the agency would perform a cost analysis to determine NASA’s “best estimate of the cost of the contract that is most likely to result from the Offeror’s proposal.” AR, Tab 2.06, RFP amend. 5 at 2507. Before adjustments, SLI’s proposed cost was $244.8M, while Jacobs’s proposed cost was $251.7M. AR, Tab 7.02, SSS at 5496.
[7] Jacobs raises other collateral allegations. Although our decision does not specifically address them all, we have considered each argument and find that none provides a basis on which to sustain the protest. For example, Jacobs alleges that NASA did not adequately document its evaluation conclusions, specifically with regard to the agency’s evaluation of proposals under the past performance and mission suitability factors. Comments and Supp. Protest at 27-28; Supp. Comments at 2-4.
Based on our review of the full record, the protester’s argument is without merit. The underlying evaluation record supports the agency’s conclusions, and the contracting officer’s statements provide further explanation of the agency’s rationale. While Jacobs argues that the contracting officer’s statements are post hoc explanations that should be entitled to no weight, when reviewing an agency’s evaluation, we do not limit our review to contemporaneous evidence, but consider all of the information provided, including the parties’ arguments and explanations. Science Applications Int’l Corp., Inc., B-408270, B-408270.2, Aug. 5, 2013, 2013 CPD ¶ 189 at 8 n.12. Post-protest explanations that provide a detailed rationale for contemporaneous conclusions and simply fill in previously unrecorded details will generally be considered in our review of the rationality of selection decisions, so long as those explanations are credible and consistent with the contemporaneous record. Remington Arms Co., Inc., B-297374, B-297374.2, Jan. 12, 2006, 2006 CPD ¶ 32 at 12. Here, the contracting officer, who was also a member of the source evaluation board, provides explanations that are credible and consistent with the contemporaneous record.
[8] The RFP provided that “[t]he evaluation team will evaluate the past performance of the Offeror and team members, as defined in L.23.” AR, Tab 2.06, RFP amend. 5 at 2502. Section L.23, concerning how offerors were to submit their past performance proposals, in relevant part, provides:
Submit information on contracts that is considered relevant in demonstrating the ability to perform the proposed effort. The submission shall include rationale supporting the Offeror’s assertion of relevance. This submission shall clearly detail what portions of the PWS, the prime, major/minor subcontractors, teaming partner, and/or joint venture partner are responsible for and the specific resources (workforce, management, facilities, or other resources) to be employed and relied upon to perform the proposed effort. If the Past Performance volume includes data on any affiliated company, division(s), business units, segments, or other organizations of the Offeror, then provide a narrative to address what they will be responsible for and/or proposing to do and the specific resources (workforce, management, facilities, or other resources) to be employed and relied upon, such that said parent et al will have meaningful involvement in contract performance.
Id. at 2475.
[9] WYEs, while not defined in the record, appear to refer to a measure of a contractor employee’s labor for a standard working year.
[10] The TEST3 contract was valued at $80 million annually and would utilize an estimated 400 WYEs. AR, Tab 6.01, SLI Final Evaluation at 5303.
[11] The protester argues that the contracting officer’s statement providing detail to the SEB’s evaluation conclusions amounts to nothing more than a post hoc justification that should be accorded little weight. However, we view the contracting officer’s explanations as consistent with the underlying record and as merely providing additional detail to the agency’s existing conclusions documented in the record. See Remington Arms Co., Inc., supra at 12.
[12] The RFP identified section 3.0 of the PWS, concerning test and evaluation services, as containing the core PWS sub-sections, specifically sub-sections 3.1-3.7. PWS at 2047.
[13] The protester also contends that the agency’s decision to consider a contract Jacobs did not include as a past performance reference tainted the agency’s relevancy determination and led to a skewed comparison of the relative merits of the offerors’ past performance. Comments and Supp. Protest at 13-14. Setting aside the fact that the RFP expressly provided that NASA “reserves the right to use both data provided by the Offeror and data obtained from other sources” in its past performance evaluation, we fail to see how the consideration of this contract competitively prejudiced Jacobs; the protester received the highest possible rating for relevancy, performance quality, and overall confidence. AR, Tab 2.06, RFP amend. 5 at 2502; see DynCorp Int’l LLC, B-411465, B-411465.2, Aug. 4, 2015, 2015 CPD ¶ 228 at 12-13 (“Competitive prejudice is an essential element of any viable protest; where the protester fails to demonstrate that, but for the agency’s actions, it would have had a substantial chance of receiving the award, there is no basis for finding prejudice, and our Office will not sustain the protest, even if deficiencies in the procurement are found.”).
[14] While we specifically address only one illustrative example, here, we have considered each argument raised by the protester regarding unequal treatment and find none provides a basis to sustain the protest.
[15] The protester also alleges that NASA failed to perform an integrated evaluation. Protest at 22-23; Comments and Supp. Protest at 21-23. The RFP provided that “the Government will conduct an integrated evaluation to consider consistency among proposal information.” AR, Tab 2.06, RFP amend. 5 at 2499. We agree with the protester that it does not appear from the protest record that the SEB conducted such an evaluation. See, e.g., AR, Tab 7.02, SSS at 5500 (stating “The SEB explained they evaluated each subfactor element under [management approach subfactor] within the confines of that subfactor element. . . .”). However, the record demonstrates that the SSA undertook such an evaluation. For example, she questioned the SEB’s assignment of a significant strength to SLI for MA4 (concerning recruitment and retention), despite assigning a weakness to SLI for MA3 (concerning total compensation plan). Id. Accordingly, to the extent the SEB did not perform an integrated analysis, we conclude Jacobs suffered no prejudice where the SSA conducted such an integrated evaluation. See DynCorp Int’l LLC, supra, at 10 n.13.
[16] The SEB explained that the main concerns with SLI’s approach were: “an [Occupational Safety and Health Administration]-compliant process safety plan that fell short of specific application to TEST3 operations; a lack of details regarding the review and approval of hazardous operations procedures beyond the test and readiness review process; and mishap investigation team membership that did not align with [Johnson Space Center Procedural Requirements] 1700.1.” AR, Tab 7.02, SSS at 5501; see also AR, Tab 6.01, SLI Final Evaluation at 5271-5273. Moreover, the SSA noted that SLI’s proposal “only provided a general discussion of process safety and hazardous operations and lacked a level of detail to fully understand how those activities would be applied to TEST3.” AR, Tab 7.02, SSS at 5501.