TITLE: B-310825; B-310825.2, DRS C3 Systems, LLC, February 26, 2008
BNUMBER: B-310825; B-310825.2
DATE: February 26, 2008
***********************************************************
B-310825; B-310825.2, DRS C3 Systems, LLC, February 26, 2008

   DOCUMENT FOR PUBLIC RELEASE
   The decision issued on the date below was subject to a GAO Protective
   Order. This redacted version has been approved for public release.

   Decision

   Matter of: DRS C3 Systems, LLC

   File: B-310825; B-310825.2

   Date: February 26, 2008

   David Z. Bodenheimer, Esq., Puja Satiani, Esq., and James G. Peyster,
   Esq., Crowell & Moring, LLP, for the protester.

   W. Jay DeVecchio, Esq., Edward Jackson, Esq., Damien C. Specht, Esq., and
   Kevin C. Dwyer, Esq., Jenner & Block LLP, for General Dynamics Advanced
   Information Systems, an intervenor.

   Andrew C. Saunders, Esq., and Alex F. Marin, Esq., Naval Sea Systems
   Command, for the agency.

   Louis A. Chiarella, Esq., and Christine S. Melody, Esq., Office of the
   General Counsel, GAO, participated in the preparation of the decision.

   DIGEST

   1. Protest alleging that firm had developed governmentwide standard
   applicable to the item being procured, thereby having an unfair
   informational advantage over other competitors, is denied where record
   establishes that firm did not have a role in developing the relevant
   governmentwide standard.

   2. A competitive advantage that derives from an offeror's previous
   performance under a government contract is not an unfair competitive
   advantage that agency is required to neutralize.

   3. Contracting agency engaged in meaningful discussions where agency
   advised protester of specific weaknesses regarding lack of a selected
   software architecture approach; agency was not required to also afford the
   protester an opportunity to cure proposal defects first introduced either
   in response to discussions or in a post-discussion proposal revision.

   4. Protest challenging the evaluation of technical proposals is denied
   where the record establishes that the agency's evaluation was reasonable
   and consistent with the evaluation criteria.

   5. Protest that past performance evaluation was unreasonable is sustained
   where record shows that: the findings in the agency evaluation report were
   not consistent with the information upon which the findings were based;
   the agency evaluators could not remember whether they evaluated and gave
   proper consideration to adverse past performance information regarding the
   awardee; and the agency did not properly assess the relevance of the
   offeror's prior contracts.

   DECISION

   DRS C3 Systems, LLC protests the award of a contract to General Dynamics
   Advanced Information Systems (GD) under request for proposals (RFP) No.
   N00024-06-R-5103, issued by the Naval Sea Systems Command (NAVSEA),
   Department of the Navy, for common enterprise display system (CEDS)
   display consoles. DRS argues that the agency's evaluation of offerors'
   proposals and subsequent source selection decision were improper. DRS also
   contends that the agency's discussions with the protester regarding its
   proposal were not meaningful, and that GD had an impermissible
   organizational conflict of interest.

   We sustain the protest in part regarding the agency's evaluation of GD's
   past performance and deny the remainder of the protester's allegations.

   BACKGROUND

   On April 17, 2006, the agency issued the RFP for the CEDS display
   consoles.[1] The CEDS display console is a workstation configuration
   comprised of display screens, furniture (e.g., console mounting brackets,
   chair), human/machine interface devices (e.g., keyboard, mouse, joystick),
   and a common electronics module (CEM), including both a network interface
   and graphics processor. In general terms, the RFP's statement of work
   (SOW) required the contractor to design, develop, produce, and support an
   enterprise family of display systems to be implemented across platform
   systems on Navy surface and subsurface ships. Agency Report (AR), Tab 1,
   CEDS Source Selection Plan, at 5.

   The RFP also informed offerors that the CEDS display console procurement
   would occur in two phases. In Phase I, the Navy intended to award multiple
   fixed-price contracts for the preliminary design of the display consoles.
   In Phase II, in which Phase I awardees were to submit detailed business
   and technical proposals for the actual execution of the CEDS display
   console project, the Navy intended to select the offeror proposing the
   best value to the agency.[2] Id. at 5-6. On December 13, the agency
   awarded Phase I preliminary design contracts to both GD and DRS. It is the
   Navy's subsequent evaluation of offerors' Phase II proposals and source
   selection decision that is the subject of DRS's protest here.

   The Phase II RFP contemplated the award of an
   indefinite-delivery/indefinite-quantity (ID/IQ) contract including both
   fixed-price and cost-reimbursement-type contract line item numbers (CLIN)
   for a CEDS display console first article unit, up to 601 production units,
   as well as associated spares, logistics, and various program, technical,
   engineering, and training services over a 4-year performance period.
   RFP sect. B; amend. 3, at 2. In addition to price, the RFP identified (in
   descending order of importance) technical approach, management approach
   and capabilities, and past performance as the nonprice evaluation factors,
   along with numerous subfactors of equal importance within each factor.
   Id., amend. 1, Instructions to Offerors, at 61-65. The solicitation also
   established that the nonprice factors, when combined, were significantly
   more important than price.[3] Id. at 61. Award was to be made to the
   responsible offeror whose proposal was determined to represent the "best
   value" to the government, all factors considered. Id. at 58.

   Both GD and DRS submitted proposals by the April 12, 2007 closing date. A
   Navy source selection evaluation board (SSEB) evaluated offerors'
   proposals as to the nonprice factors and subfactors using an adjectival
   rating system that was set forth in the RFP: outstanding; very good;
   satisfactory; marginal; unsatisfactory; and with regard to the past
   performance factor, neutral. Id., amend. 2, at 5-6. An agency cost
   evaluation team separately reviewed offerors' price and cost submissions.

   After completing its initial evaluation, the agency decided that
   discussions with offerors were necessary, and established a competitive
   range consisting of the GD and DRS proposals. The Navy conducted written
   discussions with both offerors, followed by the offerors' submission of
   final proposal revisions (FPR) by August 16. The Navy's final evaluation
   ratings of the GD and DRS proposals were as follows:

   +------------------------------------------------------------------------+
   |                                       |       GD        |     DRS      |
   |---------------------------------------+-----------------+--------------|
   |Technical Approach                     | Outstanding[4]  |  Very Good   |
   |---------------------------------------+-----------------+--------------|
   |Management Approach and Capabilities   | Outstanding[5]  |  Very Good   |
   |---------------------------------------+-----------------+--------------|
   |Past Performance                       |   Outstanding   | Outstanding  |
   |---------------------------------------+-----------------+--------------|
   |Evaluated Cost/Price                   |   $64,809,680   | $80,884,381  |
   +------------------------------------------------------------------------+

   AR, Tab 20, Final SSEB Report, at 1-51; Tab 21, Final Cost Evaluation
   Report, at 1-40.

   Importantly, the Navy's evaluation was as much about determining the
   number of strengths and weaknesses within the offerors' proposals as it
   was about the assigned adjectival ratings. The SSEB found that GD's
   proposal had
   42 strengths (31 major, 11 minor) and no weaknesses under the technical
   approach factor, 35 strengths (17 major, 18 minor) and no weaknesses under
   the management approach and capabilities factor, and 16 strengths
   (13 major, 3 minor) and no weaknesses under the past performance factor.
   By contrast, the SSEB determined that DRS's proposal had 38 strengths (17
   major, 21 minor) and 3 weaknesses (1 major, 2 minor) under the technical
   approach factor, 29 strengths (17 major, 12 minor) and 2 weaknesses
   (1 major, 1 minor) under the management approach and capabilities factor,
   and 19 strengths (13 major, 6 minor) and no weaknesses under the past
   performance factor. Id., Tab 20, Final SSEB Report, encl. 1, SSEB Briefing
   Slides, at 13, 16.

   On September 7, the SSEB and cost evaluation teams briefed the agency
   source selection advisory council (SSAC) as to their respective ratings
   and findings of the offerors' proposals. The SSAC adopted the evaluation
   findings and ratings without exception and subsequently recommended that
   contract award be made to GD. Id., Tab 22, SSAC Report, at 1-6. On October
   10, after having reviewed the evaluation reports, findings, and
   recommendations, the source selection authority determined that GD's
   higher technically rated, lower-priced proposal represented the best value
   to the government. Id., Tab 23, Source Selection Decision. This protest
   followed.

   DISCUSSION

   DRS's protest raises numerous challenges to the Navy's evaluation of
   offerors' proposals. First, the protester alleges that GD had an
   impermissible organizational conflict of interest that the Navy failed to
   recognize and take into account in its evaluation of proposals. Second,
   DRS alleges that the agency failed to engage in meaningful discussions
   with the firm regarding its technical proposal. Third, the protester
   contends that the Navy's evaluation of offerors' proposals under the
   technical and management factors was in various ways improper. Fourth, DRS
   contends that the Navy performed a flawed cost evaluation of GD's
   proposal. Lastly, DRS argues that the agency's evaluation of GD's past
   performance was improper.[6] As detailed below, we find that the Navy's
   evaluation of GD's proposal under the past performance factor was
   improper. Although we do not here specifically address all of DRS's
   remaining arguments, we have fully considered all of them and find that
   they are without merit.

   Organizational Conflict of Interest

   DRS first protests that GD had an organizational conflict of interest
   (OCI) which the Navy overlooked. Specifically, the protester contends
   that, with respect to a key CEDS requirement relating to "separation
   kernels," GD gained inside knowledge and helped to shape this same
   requirement as the prime contractor for the National Security Agency (NSA)
   high assurance platform (HAP) program. Despite this allegedly unfair
   competitive advantage on GD's part, the protester argues, the Navy failed
   to consider or to mitigate this OCI. Protest, Nov. 19, 2007, at 11-12.

   The RFP included both the SOW, which established the contract
   requirements, and the system requirements document (SRD), which
   established the performance, design, development, and test requirements
   for the CEDS display console itself.
   The SRD was developed entirely by the Navy, with no support from GD or any
   other contractor. AR, Dec. 19, 2007, at 5; Tr. at 22-24. One of the most
   significant SRD requirements was that regarding the separation kernel. RFP
   amend. 1, SRD sect. 3.6.2.3. A separation kernel is essentially a piece of
   software that creates independent and isolated software program execution
   environments (i.e., "partitions") so as to keep separate, but process
   simultaneously, information from different security classifications (e.g.,
   "secret," "top secret") under the same operating system. Id.; Tr. at
   31-32. The partitions within a separation kernel do not know that other
   partitions exist, and the information within any one partition cannot be
   transferred or shared across partitions. Relevant to the protest here, the
   SRD separation kernel requirement stated that "the candidate operating
   system shall meet the requirements of the `U.S. Government Protection
   Profile for Separation Kernels in Environments Requiring High
   Robustness.'" RFP amend. 1, SRD sect. 3.6.2.3.
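   The isolation property described above can be sketched in a short toy
   model. This is purely illustrative: the `Partition` and
   `SeparationKernel` classes below are hypothetical and are not drawn from
   the CEDS program, the SKPP, or any party's proposal.

   ```python
   class Partition:
       """An isolated execution environment for one classification level."""

       def __init__(self, level):
           self.level = level      # e.g., "secret", "top secret"
           self._data = []         # visible only inside this partition

       def process(self, item):
           self._data.append(item)
           return f"[{self.level}] processed {item!r}"


   class SeparationKernel:
       """Hosts partitions; deliberately provides no cross-partition channel."""

       def __init__(self, levels):
           self._partitions = {lvl: Partition(lvl) for lvl in levels}

       def process(self, level, item):
           # Each request is routed only to its own partition; partitions
           # are unaware that other partitions exist.
           return self._partitions[level].process(item)

       def transfer(self, src, dst, item):
           # The defining property: information within one partition can
           # never be transferred or shared across partitions.
           raise PermissionError(f"transfer from {src!r} to {dst!r} is forbidden")


   kernel = SeparationKernel(["secret", "top secret"])
   print(kernel.process("secret", "track-01"))      # -> [secret] processed 'track-01'
   print(kernel.process("top secret", "track-02"))  # -> [top secret] processed 'track-02'
   try:
       kernel.transfer("secret", "top secret", "track-01")
   except PermissionError as err:
       print("blocked:", err)
   ```

   The sketch captures only the two properties the decision emphasizes:
   simultaneous processing at different classification levels, and the
   absence of any mechanism for moving information between partitions.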

   The U.S. Government Protection Profile for Separation Kernels in
   Environments Requiring High Robustness, also referred to as the separation
   kernel protection profile (SKPP), is the governmentwide standard for
   separation kernel operating environments. The record reflects that,
   beginning in December 2002, the SKPP standard was developed by NSA in
   support of the F-22 Raptor and Joint Strike Fighter military aircraft
   programs. AR, Tab 27, NSA Declarations, at 1. The NSA working group that
   created the SKPP standard consisted of government employees with
   assistance from MITRE (a federally-funded research and development center)
   and the Naval Postgraduate School. Additionally, external input to the
   SKPP standard has been limited to those parties that responded to draft
   versions of the document that were released for public comment, and did
   not include GD. Id.

   In July 2006, prior to the Navy's award of Phase I preliminary design
   contracts for the CEDS program, GD was awarded a separate contract by NSA
   for the HAP program. GD Comments, Dec. 31, 2007, exh. 2, HAP contract,
   exh. 3, HAP Statement of Work. In general terms, the HAP program involved
   the development of a next-generation secure computing workstation and
   architecture for the military's Special Operations Command in which
   information from different security levels could be processed
   simultaneously. GD's work on the HAP program required it to deliver to NSA
   various computing architecture documents, software, running systems, and
   related program documents. Importantly, none of GD's work on the HAP
   program involved the development or the delivery of NSA's SKPP standard.
   AR, Tab 27, NSA Declarations, at 2. Further, NSA has not used, nor does it
   intend to use, any of GD's work on the HAP program for the development of
   the SKPP standard. Id.

   The record shows that while both CEDS and the HAP program involve
   processing information from multiple security levels simultaneously, the
   two programs apply different separation technologies and approaches; the
   requirements in the two programs here are also qualitatively different.
   CEDS requires, at a minimum, the ability to simultaneously and separately
   process information from six different security classifications under the
   same operating system, while the HAP program involves separating
   information in two adjacent levels of security classification. CEDS
   involves a real-time operating system (i.e., the results of one process
   are available in time for the next computing process which requires the
   previous result) and the HAP program does not. Further, while CEDS
   utilizes separation kernel technology that is to be certified by NSA
   against the most rigorous security assurance requirements, the HAP program
   does not involve the use or adaptation of a separation kernel, or mandate
   compliance with the same security assurance requirements. RFP amend. 1,
   SRD sect. 3.6.2.3; GD Comments, Dec. 31, 2007, exh. 3, HAP Statement of
   Work, attach. A, Declaration of Bill Ross, at 4-8. In sum, from the record
   before us, it appears that to the extent that GD was familiar with
   separation kernel technology, it was not as a result of its work on the
   NSA HAP program.[7]

   Contracting officers are required to identify and evaluate potential OCIs
   as early in the acquisition process as possible. Federal Acquisition
   Regulation (FAR) sect. 9.504(a)(1). The FAR provides that an OCI exists
   when, because of other activities or relationships with other persons or
   organizations, a person or organization is unable or potentially unable to
   render impartial assistance or advice to the government, or the person's
   objectivity in performing the contract work is or might be otherwise
   impaired, or the person has an unfair competitive advantage. See FAR sect.
   2.101. Situations in which OCIs arise, as addressed in FAR subpart 9.5 and
   the decisions of our Office, are generally associated with a firm's
   performance of a government contract and can be broadly categorized into
   three groups: (1) unequal access to information cases, where the primary
   concern is that a government contractor has access to nonpublic
   information that would give it an unfair competitive advantage in a
   competition for another contract; (2) biased ground rules cases, where the
   primary concern is that a government contractor could have an opportunity
   to skew a competition for a government contract in favor of itself; and
   (3) impaired objectivity cases, where the primary concern is that a
   government contractor would be in the position of evaluating itself or a
   related entity (either through an assessment of performance under a
   contract or an evaluation of proposals in a competition), which would cast
   doubt on the contractor's ability to render impartial advice to the
   government. Mechanical Equip. Co., Inc. et al., B-292789.2 et al.,
   Dec. 15, 2003, 2004 CPD para. 192 at 18; Aetna Gov't Health Plans, Inc.;
   Foundation Health Fed. Servs., Inc., B-254397.15 et al., July 27, 1995,
   95-2 CPD para. 129 at 12-13. DRS's allegation concerning GD here is
   primarily that it had an unfair competitive advantage as a result of its
   work under the HAP contract.

   We find DRS's central assertion--that as the HAP contractor GD improperly
   gained inside knowledge and helped to shape the separation kernel
   standards applicable to the CEDS procurement--to be unfounded. As a
   preliminary matter, there is no evidence (and DRS does not assert
   otherwise) that GD had a role in the development of the actual CEDS
   separation kernel requirements. Further, GD's work on the HAP program did
   not result in the offeror having a role in the development of NSA's SKPP
   standard. As set forth above, the record clearly reflects that the HAP
   program did not involve the use of separation kernel technology, none of
   GD's work on the HAP program involved the development or the delivery of
   NSA's SKPP standard, and none of GD's work product from the HAP program
   was used by NSA for the development of the SKPP standard.[8] Further, GD's
   work on the HAP program was not directly applicable to the much more
   difficult technology and security assurance requirements set forth in the
   CEDS SRD: at most, GD's work on the HAP contract taught the offeror what
   would not work for the CEDS procurement. There is simply no merit to DRS's
   allegation that GD helped to shape the NSA separation kernel standards
   that applied to the CEDS procurement, and any exposure that GD had to
   separation kernel technologies and the corresponding NSA standard was a
   competitive advantage that the Navy had no duty to neutralize. Gonzales
   Consulting Servs., Inc., B-291642.2, July 16, 2003, 2003 CPD para. 128 at
   7; Government Bus. Servs. Group, B-287052 et al., Mar. 27, 2001, 2001 CPD
   para. 58 at 10.

   DRS also argues that GD's own technical proposal indicates that the
   offeror had gained inside information and would be able to influence the
   NSA SKPP standard. Specifically, the protester points to the following
   excerpt from the GD proposal:

                                   [Deleted]

   * * * * *

                                   [Deleted]

   AR, Tab 6, GD Proposal, vol. II, Technical Proposal, at II-4.12.

   As a preliminary matter, we note that the protester selectively quotes
   from GD's proposal here and does not set forth the full, page-length
   discussion. Further, DRS's reliance on this portion of GD's proposal is
   misplaced. As discussed above, neither the HAP program nor the CEDS
   procurement has any role in shaping the NSA SKPP standards and
   corresponding certification process. We fail to see how the portion of
   GD's proposal to which the protester cites here suggests otherwise.

   DRS also asserts that it was denied a briefing by NSA regarding separation
   kernel technology, but that such a briefing occurred between NSA and GD,
   thereby providing GD with an unfair competitive advantage. DRS Comments,
   Dec. 31, 2007, at 10-11.

   GD's proposal, as part of its separation kernel trade studies analysis,
   included a statement that [deleted].[9] AR, Tab 6, GD Proposal, vol. II,
   Technical Proposal, at II-1.358. By contrast, during the CEDS solicitation
   process, NSA denied a DRS request for a meeting involving the parties'
   technical representatives. DRS Protest, Dec. 31, 2007, exh. 8, Email from
   NSA to DRS.

   The record thus indicates that NSA denied DRS's request for a meeting
   (presumably regarding separation kernel standards and certification),
   while GD, as the HAP program contractor, was able to brief NSA on its
   efforts in developing software separation kernel technology. DRS fails to
   explain, however, how GD's briefing of NSA (rather than the other way
   around) provided GD with access to any information that it did not already
   possess. Further, GD's statement in its proposal to the Navy that NSA had
   expressed support for its software separation kernel technology and would
   begin planning to fold it into the HAP program in no way establishes
   unequal access to information or an unfair competitive advantage on GD's
   part.

   Lack of Meaningful Discussions

   DRS protests that the agency failed to hold meaningful discussions with
   it. Specifically, the protester alleges that the Navy's technical
   evaluation found only one major weakness in DRS's final proposal--that its
   proposed separation kernel architecture would violate the SRD requirements
   for a POSIX-compliant operating system.[10] DRS contends that the Navy
   never raised this purported weakness with it during discussions. The
   protester argues that as its proposal explicitly identified its separation
   kernel architecture, the Navy had a duty to raise any concerns associated
   with DRS's choice of separation kernel architecture in order for the
   discussions to be meaningful. Protest, Nov. 19, 2007, at 17-19.

   The SRD stated, with regard to the CEDS CEM processing subsystem:

     The Display Console hardware shall be capable of loading and supporting
     any conventional POSIX compliant operating system, which complies with
     the operating system requirements called out in the [Open Architecture
     Computing Environment] Technologies and Standards, Sections 4.5 and 5.5.
     [. . . , and]

   * * * * *

     The [operating environment] OE shall be POSIX compliant. The OE shall be
     designed to maintain compatibility and interoperability between previous
     and current configurations of equipment.

   RFP amend. 1, SRD sections 3.6.2.1(b)(2)(a), (b)(3)(c).
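   In practical terms, POSIX compliance as invoked by the SRD means the
   operating environment exposes the standard POSIX interfaces that
   portable application code depends on. The hypothetical probe below
   (not part of the solicitation) illustrates the idea using a few
   POSIX-only functions from Python's standard library, which are absent
   on non-conforming platforms.

   ```python
   import os

   # A handful of interfaces that Python's os module exposes only when the
   # underlying platform conforms to POSIX.
   REQUIRED = ["fork", "uname", "getuid", "pathconf"]

   def missing_posix_interfaces():
       """Return the names of the probed POSIX interfaces the platform lacks."""
       return [name for name in REQUIRED if not hasattr(os, name)]

   missing = missing_posix_interfaces()
   print("platform:", os.name)
   print("missing POSIX interfaces:", missing or "none")
   ```

   On a POSIX-conforming system the probe reports no missing interfaces;
   an operating environment failing such checks would, by analogy, fall
   short of the SRD's compliance requirement.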

   DRS submitted its technical proposal as part of its initial submission on
   April 12. DRS's initial proposal did not identify a specific separation
   kernel vendor or architecture; rather, the proposal identified three
   possible separation kernel vendors, [deleted], that it was
   considering.[11] AR, Tab 5, DRS's Proposal, vol. II, Technical Proposal,
   at C-II-221 thru 226. The SSEB evaluated DRS's initial proposal as very
   good under the management approach and capabilities factor, and as
   satisfactory under the technical approach factor, and identified various
   strengths and weaknesses supporting its rating determinations. AR, Tab 11,
   Initial SSEB Report, at 28-40. The SSEB considered DRS's lack of a
   definitive separation kernel architecture approach to be a major weakness
   under both of these evaluation factors.[12] Specifically, the SSEB stated:
   "The lack of a separation kernel approach will impact the schedule thereby
   adding risk to the program to meet schedule milestone (i.e., [Critical
   Design Review], [Test Readiness Review], [Production Readiness Review]),"
   and "Without the selection [of a] separation kernel vendor [DRS's]
   architecture approach may not be achievable within schedule requirements."
   Id. at 32, 39.

   After making its competitive range determination, the Navy conducted
   discussions with each offeror, including DRS. Among the list of discussion
   issues regarding DRS's technical and management proposal, the agency
   stated:

     The lack of a selected Separation Kernel approach will impact the
     schedule thereby adding risk to the program to meet schedule milestone
     (i.e., Critical Design Review, Test Readiness Review, and Production
      Readiness Review). This was determined to be a weakness. [. . . , and]

   * * * * *

     Without the selection of a Separation Kernel vendor, the architecture
     approach may not be achievable within the schedule requirements and was
     determined to be a weakness.

   Id., Tab 14, Agency Discussions with DRS, encl. 1, List of Discussion
   Issues, at 1.

   In its response to the Navy's discussion questions, DRS addressed the
   issue of a lack of a selected separation kernel architecture as follows:

     Based on our detailed analysis to date, we now have a more specific
     viewpoint of the [deleted] products. As a result, we firmly believe that
     it is in our best interest to advance a specific [separation kernel]
     solution. The specific [deleted] product that we are selecting is the
     [deleted] solution.

   Id., Tab 15, DRS Letter to Navy, encl. 1, Responses to Discussion Items,
   at 2.

   The SSEB considered DRS's discussion responses and subsequent FPR as part
   of its final evaluation of offerors' proposals. The evaluators determined
   that DRS's response here generally alleviated the agency's original
   concern of schedule risk associated with the offeror's lack of a selected
   separation kernel approach. Id., Tab 20, SSEB Final Report, at 37, 45.
   However, the SSEB found that DRS's choice of separation kernel [deleted]
   also caused a new concern, namely that the proposed use of [deleted] as
   the separation kernel architecture would violate the SRD requirements
   for a POSIX-compliant operating system. The agency considered this
   to be a major weakness in DRS's final proposal, affecting the offeror's
   evaluation ratings under various subfactors and both the technical and
   management prime factors.

   Although discussions must address deficiencies and significant weaknesses
   identified in proposals, the precise content of discussions is largely a
   matter of the contracting officer's judgment. See FAR sect. 15.306(d)(3);
   American States Utils. Servs., Inc., B-291307.3, June 30, 2004, 2004 CPD
   para. 150 at 6. We review the adequacy of discussions to ensure that
   agencies point out weaknesses that, unless corrected, would prevent an
   offeror from having a reasonable chance for award. Northrop Grumman Info.
   Tech., Inc., B-290080 et al., June 10, 2002, 2002 CPD para. 136 at 6. When
   an agency engages in discussions with an offeror, the discussions must be
   "meaningful," that is, sufficiently detailed so as to lead an offeror into
   the areas of its proposal requiring amplification or revision. Hanford
   Envtl. Health Found.,
   B-292858.2, B-292858.5, Apr. 7, 2004, 2004 CPD para. 164 at 8. Where
   proposal defects are first introduced either in a response to discussions
   or in a post-discussion proposal revision, an agency has no duty to reopen
   discussions or conduct additional rounds of discussions. L-3 Commc'ns
   Corp., BT Fuze Prods. Div., B-299227,
   B-299227.2, Mar. 14, 2007, 2007 CPD para. 83 at 19; Cube-All Star Servs.
   Joint Venture,
   B-291903, Apr. 30, 2003, 2003 CPD para. 145 at 10-11.

   We conclude that the Navy's discussions with DRS were meaningful. As set
   forth above, the discussions expressly informed DRS of the specific
   weaknesses that the SSEB had identified in its initial proposal. Further,
   the record clearly reflects that the specific significant weakness which
   DRS claims that the Navy failed to mention in discussions was first
   introduced in DRS's discussion responses and was not part of its initial
   proposal. As a result, the Navy had no obligation to conduct additional
   rounds of discussions in order to permit the offeror to address this
   matter. See L-3 Commc'ns Corp., BT Fuze Prods. Div., supra.

   DRS does not dispute that its original proposal did not identify its
   selection of a specific separation kernel architecture, nor does it argue
   that the Navy's discussions failed to accurately convey the proposal
   weaknesses originally identified by the SSEB. Nevertheless, DRS alleges
   that the Navy's discussions were not meaningful insofar as the agency knew
   that [deleted] was DRS's design choice before discussions with offerors
   had "closed."[13] Thus, DRS argues, the Navy did not need to "reopen"
   discussions here in order to advise DRS that the agency viewed its design
   choice as a weakness or deficiency. DRS Comments, Dec. 31, 2007, at 17-19.

   As a preliminary matter, we see no basis to conclude that discussions
   closed on any date other than the date on which offerors' responses were
   due (August 6); the fact that, a week later, the agency confirmed the due
   date for FPRs has no bearing on this issue. Further, we recognize that
   there may be certain situations where, given the manner in which the
   discussions are held, the agency may not remain silent when an offeror
   introduces a matter during discussions which the agency regards as a
   proposal defect. E.g., Voith Hydro, Inc., B-277051, Aug. 22, 1997, 97-2
   CPD para. 68 at 3 (where, in written response to an area of weakness
   identified by agency, protester introduced a new weakness, and agency and
   protester thereafter engaged in oral discussions, agency was required to
   advise offeror that it regarded the new matter as a weakness). This case
   does not involve such a situation. The record here reflects that the
   Navy's discussions with offerors were conducted in writing, and did not at
   any point involve back-and-forth exchanges of information. Further, the
   SSEB did not complete its evaluation of DRS's discussion responses, and
   first identify DRS's selection of [deleted] as a proposal defect, until
   September 13, well after discussions had ended and FPRs had been
   submitted. In sum, under the circumstances here, the agency was not
   required to conduct additional discussions regarding this defect.
   Operational Res. Consultants, Inc., B-299131, B-299131.2, Feb. 16, 2007,
   2007 CPD para. 38 at 12; MD Helicopters, Inc.; AgustaWestland, Inc.,
   B-298502 et al., Oct. 23, 2006, 2006 CPD para. 164 at 48 n.47.

   Evaluation of GD's Technical Proposal

   DRS protests that the Navy's evaluation of GD's technical proposal was
   improper. The protester argues that the agency should have rejected GD's
   proposal as technically unacceptable because it failed to comply with all
   SRD requirements. DRS cites to two specific CEDS display screen
   requirements--those involving the display of acoustic "waterfall" (i.e.,
   flicker-free) data[14] and color resolution--that GD's proposal allegedly
   failed to meet.[15] Based on these specific instances of noncompliance,
   the protester maintains, the Navy should have found GD's proposal
   technically unacceptable overall and ineligible for award. DRS Protest,
   Nov. 19, 2007, at 19-22.

   In reviewing an agency's evaluation, we will not reevaluate technical
   proposals; instead, we will examine the agency's evaluation to ensure that
   it was reasonable and consistent with the solicitation's stated evaluation
   criteria and procurement statutes and regulations. Urban-Meridian Joint
   Venture, B-287168, B-287168.2, May 7, 2001, 2001 CPD para. 91 at 2. An
   offeror's mere disagreement with the agency's evaluation is not sufficient
   to render the evaluation unreasonable. Ben-Mar Enters., Inc., B-295781,
   Apr. 7, 2005, 2005 CPD para. 68 at 7. Our review of the record here shows
   the agency's evaluation of GD's proposal to be unobjectionable.

   The solicitation informed offerors that proposals were to be sufficiently
   detailed so as to enable the agency to make a thorough evaluation and to
   arrive at a sound determination as to whether or not the prospective
   offeror would be able to perform in accordance with the stated
   requirements. RFP amend. 1, Instructions to Offerors, at 42. The RFP also
   stated that "[i]f one (1) or more of the evaluation Factors or Subfactors
   are determined to be Unsatisfactory, the entire proposal may be rendered
   technically unacceptable and ineligible for award."[16] Id. at 74
   (emphasis omitted).

   The SRD contained hundreds, if not thousands, of requirements for the CEDS
   system. SRD section 3 established the actual CEDS system requirements
   while SRD section 4 established the test standards by which the Navy would
   verify the successful offeror's compliance with the section 3
   requirements. Tr. at 176, 183-84. Relevant to the protest here, the SRD
   included the following requirements regarding the CEDS display screens:

     Acoustic Data. The screens shall be suitable for displaying acoustic
     "waterfall" data.

      1. The screen shall be capable of displaying dense high-contrast shifting
     images (such as a sonogram "waterfall" output) without causing eyestrain
     to an operator as defined by MIL-STD-1472 and ASTM F1166. [. . . , and]

   * * * * *

     ECDIS-N. The display console shall be Electronic Chart Display and
     Information System -- Navy (ECDIS-N) certifiable. Graphics capabilities
     shall be compatible with and meet the requirements to display navigation
     applications [in accordance with Operational Navy Instruction] 9420.2
     (ECDIS-N performance requirements).

   RFP amend. 1, SRD sections 3.6.1.1(d)(1), 3.6.2.1(e)(7). The SRD's display
   screen requirements were not new or developmental in nature; the Navy had
   used similar standards for its predecessor display console system, the
   Q70, and the contractor there had successfully achieved them. Tr. at 173.

   GD's technical approach proposal, consisting of more than 500 pages,
   included sections which addressed both the "waterfall" data display and
   color resolution requirements. Specifically, the proposal described GD's
   [deleted], as well as the various functional and performance properties of
   its display consoles in relation to the SRD requirements. AR, Tab 6, GD's
   Proposal, vol. II, Technical Proposal, at II.1.106 thru 112, 296 thru 302.
   GD's proposal also expressly represented that its display screens would be
   suitable for displaying acoustic waterfall data in accordance with SRD
   sect. 3.6.1.1(d), and provided information as to how GD would achieve the
   requirement.[17] Id. at II.1.112. Further, GD's proposal represented that
   its display screens would comply with the color requirements of ECDIS-N
   and applicable Navy instruction. Id., app. B, Requirements Verification
   Test Matrix, at 23; Tab 32, GD Prime Item Development Specification, at
   86.

   The record shows that, when evaluating offerors' proposals, the SSEB
   clearly considered certain SRD requirements to be more challenging than
   others. The evaluators believed the separation kernel requirements to be
   very demanding insofar as the work here was almost developmental in nature
   and had not been achieved before. Tr. at 32-33. By contrast, the SSEB did
   not consider the CEDS display screen requirements to be as difficult to
   meet. Tr. at 179-80. The evaluators were aware that the display screen
   requirements here were similar to those successfully achieved on the
   Navy's prior Q70 display system, that GD had previously produced display
   systems which displayed acoustic waterfall data for Navy attack
   submarines, and that several other commercial companies produced display
   systems that met SRD requirements. Tr. at 173-75. In light of this
   information, as well as the market surveys that the Navy had performed
   prior to release of the RFP here, the evaluators were not significantly
   concerned about the ability of offerors to meet the SRD display screen
   requirements. See Tr. at 179-81, 189.

   The SSEB determined that GD's proposal met or exceeded all solicitation
   requirements. AR, Tab 20, Final SSEB Report, at 11-25. Relevant to the
   protest here, the evaluators found that GD's proposal met (but did not
   exceed) the SRD display screen requirements regarding both acoustic
   waterfall display and color resolution. Tr. at 171. The SSEB concluded
   that, given the perceived degree of difficulty of the display screen
   requirements here as well as the information provided by GD in its
   proposal, the offeror both understood and expressly agreed to comply with
   the SRD requirements. Tr. at 233-34.

   We conclude that the agency's evaluation of GD's technical proposal was
   reasonable and consistent with the stated evaluation criteria. As a
   preliminary matter, DRS confuses the actual SRD requirements with the
   standards by which the Navy would later test the successful offeror's CEDS
   system for compliance. It was only SRD section 3 that established the
   actual requirements which offerors' proposals were to address, and there
   was simply no requirement that proposals also address the SRD test
   verification standards. Additionally, while the RFP required proposals to
   provide sufficient detail to determine whether the offeror would be able
   to perform in accordance with the stated requirements, the RFP did not
   require offerors to demonstrate that they had already achieved the SRD
   requirements in order to be found technically acceptable. The record
   reflects that GD's proposal addressed both of the SRD requirements that
   DRS claims were lacking.[18]

   DRS argues that GD's proposal should have been found to be technically
   unacceptable because it did little more than recite, or "parrot back," the
   SRD requirements in these two specific areas. The protester maintains that
   an offeror's ability to quote a specification verbatim does not establish
   technical compliance. DRS Comments, Jan. 18, 2008, at 30.

   A proposal with significant informational deficiencies may be found
   technically unacceptable, and an offeror's extensive parroting of an RFP's
   requirements may be considered as evidence of the offeror's failure to
   demonstrate a clear understanding of those requirements. See Government
   Telecomms., Inc., B-299542.2, June 21, 2007, 2007 CPD para. 136 at 5;
   Wahkontah Servs., Inc., B-292768, Nov. 18, 2003, 2003 CPD para. 214 at 7.
   Here, however, the record reflects that GD's proposal was extremely
   detailed in nature, fully demonstrating that the offeror understood and
   would meet the CEDS requirements. While GD's proposal may not have been as
   detailed in these two specific areas as it was in other parts, we have no
   basis to conclude that the agency's determination that GD's proposal was
   technically acceptable was unreasonable.[19] We find DRS's challenge to
   the Navy's evaluation amounts to mere disagreement with the agency's
   judgment and, thus, does not establish that the evaluation was
   unreasonable. JAVIS Automation & Eng'g, Inc., B-293235.6, Apr. 29, 2004,
   2004 CPD para. 95 at 5.

   Evaluation of DRS's Technical Proposal

   DRS protests that the Navy's evaluation of its technical proposal was
   improper in various ways. The protester primarily argues that DRS was
   treated unequally in the evaluation process.[20] DRS maintains that in
   many instances, under both the technical and management factors, the Navy
   failed to recognize various aspects of DRS's proposal as strengths even
   though DRS's proposal was identical to that of GD and the agency
   determined that GD's proposal warranted strengths in these same areas.
   Although we do not address all of the protester's arguments regarding the
   agency's evaluation of its proposal, including all the alleged
   "unrecognized DRS strengths," we have fully considered them all and find
   no basis upon which to sustain the protest.

   It is a fundamental principle of federal procurement law that a
   contracting agency must treat all offerors equally and evaluate their
   proposals evenhandedly against the solicitation's requirements and
   evaluation criteria. Rockwell Elec. Commerce Corp., B-286201 et al., Dec.
   14, 2000, 2001 CPD para. 65 at 5; CRAssociates, Inc., B-282075.2,
   B-282075.3, Mar. 15, 2000, 2000 CPD para. 63 at 5. Our review of the
   record confirms that the Navy evaluated offerors' proposals equally under
   the technical and management factors, and that the difference in
   evaluation ratings here was not the result of unequal treatment by the
   agency but instead stemmed from the agency's recognition of differences in
   the offerors' proposals.

   For example, DRS argues that because the Navy found GD's proposal to have
   several major strengths for its separation kernel solution [deleted], the
   agency should have likewise found similar strengths in DRS's proposal.
   Protest, Dec. 31, 2007, at 31-33. The SSEB determined that GD's proposal
   warranted strengths in this area because the offeror had established
   [deleted] in which the GD [deleted] the separation kernel software
   architecture needed for the CEDS system. AR, Tab 6, GD Proposal, vol. II,
   Technical Proposal, at II.1-358; Tab 19, GD FPR, at 1; Tab 20, Final SSEB
   Report, at 15, 23-24; Tr. at 52-66. The evaluators believed that GD's
   [deleted] would result in [deleted] solution, thereby reducing the risk
   associated with providing and certifying the separation kernel
   architecture. AR, Tab 20, Final SSEB Report, at 15, 23-24; Tr. at 56. By
   contrast, the SSEB determined that although DRS's FPR included [deleted].
   AR, Tab 18, DRS FPR, at II.C-234f, l; Tr. at 57-62. Additionally, as DRS's
   proposal did not reflect an [deleted], the fact that the offeror planned
   to [deleted] did not alleviate the SSEB's concerns that the [deleted]
   development of hardware and software could increase performance risk. Tr.
   at 61-62.

   In our view, the agency's evaluation of DRS's proposal here was reasonable
   and consistent with the stated evaluation criteria. First, the SSEB
   reasonably judged GD's [deleted] to be of value to the agency, thereby
   warranting strengths in this regard. Further, the SSEB reasonably
   determined that DRS's proposal did not evidence the same [deleted] as
   existed in GD's proposal. The agency reasonably determined that the
   proposals of GD and DRS were different in this regard and, in light
   thereof, rated the proposals differently.

   Cost/Price Evaluation

   DRS protests that the Navy failed to perform a proper evaluation of GD's
   cost and price proposal. Specifically, the protester alleges that the
   agency's cost analysis failed to adequately consider whether the awardee's
   proposed costs were realistic to perform the work required by the RFP. In
   conjunction with its assertion that GD's proposed display glass fails to
   meet all SRD requirements, DRS alleges that the cost of SRD-compliant
   glass is [deleted] greater than that proposed by GD and that this accounts
   for approximately [deleted] of the cost difference between the offerors'
   proposals.[21] The protester also maintains that GD's technical
   noncompliance would result in cost increases to the Navy during contract
   performance, and the agency's failure to reasonably determine GD's
   realistic costs adversely affected the resulting source selection
   decision.[22] DRS Protest, Nov. 19, 2007, at 27-28.

   The reasonableness of an agency's cost or price evaluation is directly
   related to the financial risk that the government bears because of the
   contract type it has chosen. When an agency evaluates proposals for the
   award of a cost-reimbursement contract (or cost-reimbursement portion of a
   contract), an offeror's proposed costs of contract performance are not
   considered controlling because, regardless of the costs proposed by an
   offeror, the government is bound to pay the contractor its actual and
   allowable costs. FAR sect. 16.301-1; Metro Mach. Corp., B-295744,
   B-295744.2, Apr. 21, 2005, 2005 CPD para. 112 at 9. Consequently, an
   agency must perform a cost realism analysis to determine the extent to
   which an offeror's proposed costs represent what the contract performance
   should cost, assuming reasonable economy and efficiency. FAR
   sections 15.305(a)(1), 15.404-1(d)(1), (2); Magellan Health Servs.,
   B-298912, Jan. 5, 2007, 2007 CPD para. 81 at 13; The Futures Group Int'l,
   B-281274.2, Mar. 3, 1999, 2000 CPD para. 147 at 3. By contrast, when an
   agency evaluates proposals for the award of a fixed-price contract (or
   fixed-price portion of a contract), in which the government's liability is
   fixed and the contractor bears the risk and responsibility for the actual
   costs of performance, see FAR sect. 16.202-1, the analysis of an offeror's
   price need only determine that the price offered is fair and reasonable to
   the government (i.e., price reasonableness), and focuses primarily on
   whether the offered price is higher--as opposed to lower--than
   warranted.[23] See CSE Constr., B-291268.2, Dec. 16, 2002, 2002 CPD para.
   207 at 4; WorldTravelService, B-284155.3, Mar. 26, 2001, 2001 CPD para. 68
   at 4 n.2.

   As set forth above, the RFP contemplated the award of an ID/IQ contract
   including both fixed-price and cost-reimbursement CLINs, as follows:[24]

   +------------------------------------------------------------------------+
   |CLIN |           Supply or Service            |      Contract Type      |
   |-----+----------------------------------------+-------------------------|
   |0003 |First Article Unit                      |Cost Plus Award Fee      |
   |-----+----------------------------------------+-------------------------|
   |0005 |Production Units - Year 1               |Fixed Price Incentive[25]|
   |-----+----------------------------------------+-------------------------|
   |0006 |Production Units -- Years 2 thru 4      |Firm Fixed Price         |
   |-----+----------------------------------------+-------------------------|
   |0007 |Spares & Installation Checkout Hardware |Firm Fixed Price         |
   |-----+----------------------------------------+-------------------------|
   |0008 |Performance Based Logistics             |Firm Fixed Price         |
   |-----+----------------------------------------+-------------------------|
   |0010 |Program Services                        |Cost Plus Award Fee      |
   |-----+----------------------------------------+-------------------------|
   |0011 |Technical & Engineering Services        |Cost Plus Award Fee      |
   |-----+----------------------------------------+-------------------------|
   |0013 |Training Services                       |Time and Materials       |
   +------------------------------------------------------------------------+

   RFP sect. B, at 1-7. The RFP also informed offerors how the agency would
   perform its cost/price evaluation and specified the unit quantities to be
   considered for each CLIN (e.g., CLINs 0005 and 0006 involved a total of
   601 CEDS display consoles and 127 CEMS). RFP amend. 3, at 3.

   The Navy's cost/price evaluation of the GD and DRS FPRs, as corrected, was
   as follows:[26]

   +------------------------------------------------------------------------+
   |    CLIN    | GD Proposed | GD Evaluated | DRS Proposed | DRS Evaluated |
   |------------+-------------+--------------+--------------+---------------|
   |    0003    |  [deleted]  |  [deleted]   |  [deleted]   |   [deleted]   |
   |------------+-------------+--------------+--------------+---------------|
   | 0005/0006  |  [deleted]  |  [deleted]   |  [deleted]   |   [deleted]   |
   |------------+-------------+--------------+--------------+---------------|
   | 0007/0008  |  [deleted]  |  [deleted]   |  [deleted]   |   [deleted]   |
   |------------+-------------+--------------+--------------+---------------|
   |    0010    |  [deleted]  |  [deleted]   |  [deleted]   |   [deleted]   |
   |------------+-------------+--------------+--------------+---------------|
   |    0011    |  [deleted]  |  [deleted]   |  [deleted]   |   [deleted]   |
   |------------+-------------+--------------+--------------+---------------|
   |    0013    |  [deleted]  |  [deleted]   |  [deleted]   |   [deleted]   |
   |------------+-------------+--------------+--------------+---------------|
   |   Total    | $67,960,422 | $67,958,161  | $84,335,367  |  $84,313,512  |
   +------------------------------------------------------------------------+

   AR, Dec. 19, 2007, at 38; AR, Tab 21, Final Cost Evaluation Report, at
   33-40.

   As shown above, the offerors' evaluated costs/prices were essentially the
   same as those the offerors proposed. In fact, the only instance where the
   agency took exception to the costs and prices as proposed and made certain
   minor adjustments was with regard to cost-reimbursement CLIN 0003, the
   first article unit. AR, Tab 21, Final Cost Evaluation Report, at 8-11.

   DRS maintains that the Navy's cost realism evaluation of GD's proposal was
   unreasonable, not because GD's proposed costs are unrealistic in
   comparison to the particular methods of performance and materials
   described in the offeror's technical proposal, but rather because GD's
   proposed costs are unrealistic in comparison to the work to be performed
   (specifically, the SRD display glass requirements). The protester alleges
   that because GD's proposed display glass was noncompliant and would
   thereby result in cost increases to the government for the entire duration
   of the CEDS program, the Navy failed in its duty to perform a proper cost
   realism evaluation. We disagree.

   As a preliminary matter, we note that most of the CLINs here either did
   not involve display glass or were fixed-price in nature. Specifically,
   CLINs 0010, 0011, and 0013 do not involve display glass. Further, the
   majority of CLINs which involve display glass (0005 thru 0008) are
   fixed-price, thereby establishing contractual limits on the Navy's cost
   liability. In fact, the only CLIN that both involves display glass and is
   cost-reimbursement in nature is CLIN 0003, the First Article Unit. By
   contrast, DRS's assertion that approximately [deleted] of the cost
   difference between the offerors' proposals is attributable to GD's
   noncompliant display glass is based on consideration of the fixed-price
   production units. Quite simply, the cost-reimbursement portion of the CEDS
   procurement involving display glass, for which DRS alleges the Navy failed
   in its duty to perform a proper cost realism analysis, was limited to the
   first article unit. Moreover, because DRS's assertion that the Navy
   failed to perform a proper cost realism analysis of GD's proposal is
   factually premised on the claim that the awardee's technical proposal
   was noncompliant, and because we have determined that the Navy's
   technical evaluation of GD's proposal was reasonable, we also find no
   merit in the protester's indirect challenge here to the agency's
   technical evaluation of proposals.

   Past Performance Evaluation

   DRS challenges the agency's evaluation of GD's past performance. The
   protester maintains that the Navy failed to properly consider various
   adverse past performance information regarding GD when conducting its
   evaluation. DRS also argues that various strengths which the SSEB
   identified in its evaluation of GD's past performance are inconsistent
   with the underlying past performance information regarding the awardee.
   DRS argues that, in light thereof, the agency's decision to rate GD's past
   performance as "outstanding," and equivalent to that of the protester, was
   inconsistent with the RFP's stated evaluation criteria. The protester also
   contends that it was prejudiced as a result of the Navy's flawed past
   performance evaluation here.

   As a general matter, the evaluation of an offeror's past performance is a
   matter within the discretion of the contracting agency, and we will not
   substitute our judgment for reasonably based past performance ratings.
   However, we will question an agency's evaluation conclusions where they
   are unreasonable or undocumented. Clean Harbors Envtl. Servs., Inc.,
   B-296176.2, Dec. 9, 2005, 2005 CPD para. 222 at 3; OSI Collection Servs.,
   Inc., B-286597, B-286597.2, Jan. 17, 2001, 2001 CPD para. 18 at 6. The
   critical question is whether the evaluation was conducted fairly,
   reasonably, and in accordance with the solicitation's evaluation scheme,
   and whether it was based on relevant information sufficient to make a
   reasonable determination of the offerors' past performance. Clean Harbors
   Envtl. Servs., Inc., supra. As detailed below, the agency's past
   performance evaluation here did not meet this standard.

   The RFP instructed offerors, with regard to past performance, to
   demonstrate "how the proposed team's past experience and quality
   performance on programs of similar complexity make it qualified to execute
   the CEDS program (describe relevant and pertinent past performance for
   prime and major subcontractors)." RFP amend. 1, Instructions to Offerors,
   at 58. Offerors were also required to submit relevant experience--that is,
   contracts on-going or completed in the previous 5 years that involved work
   similar to the CEDS procurement in terms of technology, type of effort
   (development, production, and maintenance), contract scope, schedule and
   risk--for evaluation.[27] Id. at 60. Additionally, as part of its
   evaluation of offerors' past performance, the agency reserved the right to
   obtain information from sources other than those provided by the offerors.
   Id. at 58.

   The NAVSEA contracting officer gathered and provided to the SSEB the
   offerors' past performance information. The past performance information
   regarding GD consisted of both contractor performance assessment reports
   (CPAR) and past performance questionnaires for the offeror itself and its
   proposed subcontractors.[28] AR, Tab 6, GD Past Performance Information.
   Relevant to the protest here, the GD past performance information also
   included a CPAR for General Dynamics [deleted] regarding its performance
   of the Navy's multifunctional cryptographic systems (MCS) contract.[29]

   The MCS CPAR was very negative in its assessment of the contractor's
   performance. Specifically, GD [deleted] received ratings of
   "unsatisfactory" for the areas of technical (quality of product),
   schedule, and cost control, and "marginal" for management. Id., MCS CPAR,
   at 2. The CPAR found the reliability of the delivered MCS product to be of
   concern: "Given the number of software defects and performance issues with
   the baseline, the government began to question the viability of the
   product in terms of long-term performance and reliability for the Fleet."
   Id. at 3. The CPAR also found program schedule growth to be of concern:
   numerous software problems and extensive regression testing periods caused
   repeated delays resulting in the government questioning "the viability of
   the product and the loss of confidence in the program that it would
   achieve the planned capability performance for MCS." Id. The CPAR also
   found cost control to be a significant concern: "The government's concern
   with the contractor's cost projections was based on the contractor's
   inability to adequately estimate the remaining efforts; this has been an
   on-going issue throughout the life of the program." Id. Additionally,
   notwithstanding the contractor's view that the problems experienced here
   resulted from Navy-furnished information and equipment, the CPAR reflects
   that the agency reviewing official agreed with the agency assessing
   official's evaluation here. Id. at 4-5.

   The SSEB rated GD's past performance as "outstanding" overall and
   determined that the awardee's proposal had 16 strengths (13 major,
   3 minor) and no weaknesses under this evaluation factor. Among the various
   strengths the SSEB found relating to GD's past performance were:

     Majority (three out of four) CPARs reviewed for the Contractor, rated
     [deleted] as exceptional or very good in technical, product performance,
     systems engineering, logistics support/ sustainment, schedule, cost
     control and management (Major).

     Contractor provided 8 non-[contract data requirements list] deliverables
     which provided advanced insights into the Contractor's management plans
     and processes (i.e., QA Plan, Program Management Plan and Risk Plan)
     (Major).

     GD[] met schedule for Phase I deliverables, and provided drafts of
     8 deliverables not required yet for Phase I (e.g., [quality assurance]
     plan, SEMP, T&E Plan), thus reducing risk of on-time delivery of awarded
     contractor for Phase II (Major).

   AR, Tab 20, Final SSEB Report, at 30. Importantly, the SSEB's report did
   not mention the adverse CPAR ratings regarding the MCS contract. The
   agency evaluation report also did not indicate to what extent, if at all,
   the agency considered the relevance of the past performance information
   received regarding GD. Id.

   In its protest, DRS argued that the Navy had improperly disregarded
   adverse past performance information regarding GD. The protester
   maintained that the agency evaluation report failed to recognize and take
   into account the MCS CPAR, even though it was relevant to every past
   performance subfactor. DRS argued that the Navy's failure to take this
   adverse past performance information regarding the awardee into account
   constituted a departure from the stated evaluation criteria that was
   prejudicial to DRS. Protest, Dec. 31, 2007, at 53-58.

   In its report to our Office, the Navy originally argued that the SSEB had
   reasonably disregarded the MCS CPAR as part of its evaluation of GD's past
   performance. The agency contended that only two specific divisions of
   General Dynamics--[deleted]--would be involved in performing the CEDS work
   here, while the MCS CPAR involved another GD division--[deleted]. Because
   the past performance information involved a General Dynamics division that
   would not be performing work on the CEDS project, the agency argued, it
   was reasonable not to consider this information as relevant in the
   evaluation of the awardee's past performance.[30] AR, Jan. 11, 2008, at
   34.

   In its comments to the agency report, DRS provided information to
   demonstrate that GD [deleted] was in fact [deleted]. Specifically, GD
   [deleted] had been merged by the parent company into [deleted] "with the
   integrated unit continuing to operate as [deleted]." DRS Comments, Jan.
   18, 2008, at 20. Quite simply, DRS argued, the specific General Dynamics
   division mentioned in the adverse MCS CPAR was one of the two General
   Dynamics divisions that the agency acknowledged would be performing the
   CEDS work here. Thus, the protester maintained, the Navy's stated factual
   basis for not considering the MCS CPAR was completely inaccurate. Id. at
   19-20.

   At the hearing conducted by our Office, the SSEB chairman originally
   testified that the agency evaluators did not see and did not consider the
   MCS CPAR as part of their evaluation of GD's past performance. Tr. at
   203-05. The Navy, however, subsequently introduced evidence that the MCS
   CPAR had in fact been considered by the SSEB in its evaluation of GD's
   past performance insofar as the evaluation report included specific
   findings that could only be attributable to the MCS CPAR.[31] Id. at
   366-70. The SSEB chairman stated, however, that he still had no
   recollection of ever having considered the MCS CPAR as part of the
   agency's evaluation of GD's past performance. Id. at 361, 368-69, 376. For
   example, the following exchange took place with the SSEB chairman:

     Q: [C]orrect me if I'm wrong. You stated you don't remember considering
     the GD CPARs on MCS, correct?
     A: I believe I stated I don't recall seeing it.
     Q: Do you recall evaluating it?
     A: If I didn't see it, how can I actually evaluate it?
     Q: You mentioned that you had a conversation with the deputy on the
     SSEB, is that correct?
     A: Yes.

   * * * * *

     Q: And your recollection of that discussion with the deputy was that he
     also did not remember this CPARs?
     A: That's what he told me.
     Q: If you don't remember seeing it and the deputy doesn't remember
     seeing it, how do you know that you gave it proper consideration in the
     agency's past performance evaluation of GD?
     A: I don't know.

   Id. at 407-08.

   At the hearing conducted by our Office, the SSEB chairman also discussed
   how the evaluators considered the relevance of offerors' past performance
   information. At one point the lead evaluator indicated that the
   determination of whether an offeror's past performance was similar to the
   work to be performed was based on whether it involved the delivery of
   equipment: "We would look at the CPARs. We looked at the work. If it was
   similar in terms of they were producing a piece of equipment, we would
   count that as being similar." Id. at 214. At another point, the following
   exchange occurred with the SSEB chairman:

     Q: Did you give some references or some CPARs more weight than others
     because they were -- they were the same or similar, they were more
     relevant to the work here?
     A: I believe we evaluated and gave credit for every CPARs we received.

   * * * * *

      Q: I've looked at the SSEB report. . . . I did not see in here the
      agency's -- the agency saying that some of the references were more
      relevant than others. Am I missing anything?
      A: No. We treat[ed] them all equally.
      Q: Regardless of relevance? And what if it was really good past
      performance, but it has nothing to do with the technology of CEDS. How
      much weight do you give that? Do you think that that should be weighed
      equally to something that is highly relevant and high quality?
      A: No.

   Id. at 211-13.

     The SSEB chairman also indicated that at least one of the strengths
     identified in the agency's report regarding GD's past performance was
     inaccurate. As set forth above, the SSEB report considered as a major
     strength the fact that a majority (i.e., three out of four) of the CPARs
     for proposed subcontractor [deleted] rated its performance as either
     exceptional or very good. The SSEB chairman acknowledged that this
     finding was inaccurate, and that instead two of the four CPARs for
     [deleted] had rated its performance as either outstanding or very good.
     Id. at 404.

     We conclude that the agency's evaluation of GD's past performance was
     not reasonable or consistent with the stated evaluation criteria. Of
     foremost concern, the record indicates that the Navy failed to give
     meaningful consideration to all the relevant past performance
     information that it possessed regarding GD. The evaluation report
     reflects that the SSEB was aware of, and apparently considered to some
     degree, the CPAR regarding the MCS contract. The agency cannot provide
     an explanation, however, as to why the contractor's self-serving
     rebuttal (which the Navy reviewing official for the MCS CPAR did not
     accept) merited two major strengths, while the extremely adverse
     information and ratings regarding the contractor's performance in the
     areas of technical, schedule, cost control, and management were
     completely ignored. Tr. at 378. Additionally, the SSEB chairman admits
      having no recollection of ever having seen or considered the MCS CPAR
      and, as a result, we cannot say that the Navy gave proper consideration
      to
     this adverse past performance information in its evaluation.[32] We fail
     to see how the agency can properly evaluate an offeror's past
     performance when its evaluators admittedly do not remember if all the
     past performance information was in fact considered.

     The record also reflects that the Navy failed to adequately consider the
     relevance of GD's past performance information as part of the
     evaluation. An agency is required to consider the similarity or
     relevance of an offeror's past performance information as part of its
     evaluation of past performance. See FAR sect. 15.305(a)(2) (the
     relevance of past performance information shall be considered); United
     Paradyne Corp., B-297758, Mar. 10, 2006, 2006 CPD para. 47 at 5-6; Clean
     Harbors Envtl. Servs., Inc., supra.

     The RFP here instructed offerors to provide past performance information
     that was "relevant and pertinent," and later defined "relevant" as
     similar to the CEDS procurement in terms of technology, type of effort,
     contract scope, schedule, and risk. RFP amend. 1, Instructions to
     Offerors, at 60, 62. The record does not reflect that the agency
     adequately considered whether GD's past performance information was in
     fact similar to the CEDS procurement in accordance with the RFP.

      The CPARs and questionnaires upon which the SSEB based its evaluation
     of GD's past performance furnished adjectival ratings and narratives
     regarding the quality of an offeror's performance in various areas. The
     contemporaneous evaluation report does not indicate that the agency went
     beyond considerations of quality and also considered the relevance of
     the offerors' past performance references. The SSEB's evaluation
     findings regarding GD concern the quality of the offeror's prior
     performance and indicate equal consideration of the offeror's past
     performance references without regard to relevance. Further, at the
     hearing conducted by our Office, the SSEB chairman's statements were, at
     best, ambiguous as to the agency's consideration of relevance.
     Specifically, the lead evaluator indicated that the SSEB gave equal
     consideration to all the offeror's past performance references,
     irrespective of relevance, and that the determination of what past
     performance was deemed "similar" was based simply on whether the prior
     work involved producing a piece of equipment. As the RFP required the
     agency to determine whether an offeror's past performance was similar to
     the CEDS procurement in terms of technology, type of effort, contract
     scope, schedule, and risk, we conclude that the agency did not properly
     consider the relevance of GD's past performance in its evaluation.

     The record also reflects various inaccuracies in the SSEB report
     regarding GD's past performance. As detailed above, the SSEB chairman
     admits that one of the strengths given to GD--that a majority of the
     CPARs for [deleted] rated it as exceptional or very good--was factually
     inaccurate. Moreover, the two strengths given to GD related to its MCS
     CPAR are redundant, as well as based entirely on assertions by the
     contractor with which the Navy reviewing official there did not agree.
     In addition, GD received a major strength for certain CEDS document
     deliverables that provided insight into the contractor's management
     plans and processes--a fact that has nothing to do with past
     performance. In sum, several of the SSEB's specific findings regarding
     GD's past performance are without factual justification.

     The Navy argues that notwithstanding any deficiencies in its evaluation
     of GD's past performance, the protest here should not be sustained
     because DRS was not prejudiced. Specifically, the agency maintains that
     given GD's significant advantage over DRS in the technical approach
     factor,[33] and the relative importance of the technical approach and
     past performance evaluation factors, it is impossible for DRS to be
     found technically superior to GD overall, thereby requiring a
     price/technical tradeoff which the agency did not originally have to
     make. In support thereof, the agency points to the fact that it was GD's
     undisputed technical advantages (i.e., strengths that existed only in
     GD's proposal) upon which the SSAC exclusively relied for its
     determination that GD's proposal was technically superior to that of
     DRS. Accordingly, the Navy argues, none of the alleged deficiencies
     regarding the agency's evaluation of GD's past performance can possibly
     change the conclusion that GD's proposal was technically superior
     overall to that of DRS. AR, Feb. 14, 2008, at 2-11, 22-24.

      Competitive prejudice is an essential element of a viable protest;
     where the protester fails to demonstrate that, but for the agency's
     actions, it would have had a substantial chance of receiving the award,
     there is no basis for finding prejudice, and our Office will not sustain
     the protest. Joint Mgmt. & Tech. Servs., B-294229, B-294229.2, Sept. 22,
     2004, 2004 CPD para. 208 at 7; see Statistica, Inc. v. Christopher,
     102 F.3d 1577 (Fed. Cir. 1996).

     We recognize that GD's proposal was found to have technical strengths
     that DRS's did not, and that the RFP established that the technical
     approach factor was more important than the management approach and
     capabilities factor, which in turn was more important than the past
     performance factor. However, as detailed above, the record shows that
     the Navy's evaluation of GD's past performance was fundamentally flawed:
     it failed to adequately consider all relevant information; it failed to
     adequately consider the relevance of the offeror's past performance
     information; and several of the identified strengths are factually
     inaccurate and/or redundant. In light of these significant deficiencies
     in the agency's evaluation of GD's past performance, we simply cannot
     reasonably determine what GD's rating--or its strengths and
     weaknesses--should have properly been here. By contrast, in light of the
     errors which the Navy concedes occurred in other aspects of its
     evaluation of GD's proposal, it appears that the GD and DRS proposals
     have equivalent overall ratings of "very good" under both the technical
     approach and management approach and capabilities factors, and DRS
     received an "outstanding" rating for its past performance. Consequently,
     as we cannot determine that GD's proposal would remain technically
     superior overall, we conclude that the agency's actions here were
     prejudicial to the protester.

     RECOMMENDATION

     We recommend that the agency reevaluate offerors' past performance,
     giving due consideration to all relevant information as well as the
     relevance of the offerors' prior and current contracts and, based on
     that reevaluation, make a new source selection determination. If, upon
     reevaluation of proposals, DRS is determined to offer the best value to
     the government, the Navy should terminate GD's contract for the
     convenience of the government and make award to DRS. We also recommend
     that DRS be reimbursed the costs of filing and pursuing the protest,
     including reasonable attorneys' fees, limited to the costs relating to
     the ground on which we sustain the protest. 4 C.F.R. sect. 21.8(d)(1)
     (2007). DRS should submit its certified claim for costs, detailing the
     time expended and costs incurred, directly to the contracting agency
     within 60 days after receipt of this decision. 4 C.F.R. sect.
     21.8(f)(1).

     The protest is sustained in part and denied in part.

     Gary L. Kepplinger
     General Counsel

   ------------------------

   [1] The RFP established two categories of CEDS display systems--display
   consoles and remote displays--for which the Navy made separate award
   determinations. DRS's protest here concerns only the Navy's CEDS display
   consoles procurement.

   [2] The awarded Phase I contracts also served as the solicitation for
   Phase II proposals: they each included a SOW, a CEDS system requirements
   document (SRD), instructions to offerors regarding the submission of
   proposals, and evaluation factors for award. AR, Dec. 19, 2007, at 4. For
   purposes of this decision, further references to the RFP and/or
   solicitation refer to the Phase II procurement. Additionally, so as to
   avoid confusion, the citations within this decision will refer to the
   "RFP" for CEDS Phase II rather than to the Phase I contracts in which the
   relevant solicitation provisions are located.

   [3] We note that another section of the RFP stated that overall technical
   merit was more important (as opposed to significantly more important) than
   total evaluated price. Id. at 67.

   [4] As explained in detail below in footnote 20, during the course of the
   protest the Navy conceded certain evaluation errors. In light thereof,
   GD's proposal under the technical approach factor appears to merit a "very
   good" rather than an "outstanding" rating.

   [5] As with GD's rating under the technical approach factor, given the
   errors which the Navy concedes occurred, GD's proposal under the
   management approach and capabilities factor also appears to merit a "very
   good" rather than an "outstanding" rating.

   [6] In its original protest DRS also argued that the Navy had made a
   cardinal change to GD's contract by substantially changing the quantity of
   CEDS display consoles. DRS Protest, Nov. 19, 2007, at 29-30. At a hearing
   conducted by our Office as part of our review of the protest, DRS
   acknowledged that it had abandoned this issue. Hearing Transcript (Tr.) at
   257.

   [7] During the course of the protest, GD provided statements demonstrating
   that its expertise in separation kernel technology was the result of its
   long-time involvement in the design of information assurance systems such
   as encryption equipment. In 1999, as a result of an independent research
   and development (IR&D) project, GD employees filed a patent application
   for a mathematically analyzed separation kernel. [Deleted] GD Comments,
   Dec. 31, 2007, attach. A, Declaration of Bill Ross, at 4-5.

   [8] We acknowledge that our findings here are based largely on statements
   provided by the government and GD employees. However, we see no basis (nor
   does DRS provide one) to question the accuracy of the statements.

   [9] This briefing appears to have occurred in the context of GD being the
   current HAP program contractor.

   [10] POSIX, or Portable Operating System Interface for Unix, refers here
   to the Navy's open software architecture initiative of ensuring that the
   CEDS system does not adversely affect any host application software with
   which it would interface. Tr. at 48-49.

   [11] DRS's initial proposal also did not identify the specific [deleted]
   separation kernel product in which it was most interested; instead, DRS
   listed [deleted] as potential options here. AR, Tab 5, DRS's Proposal,
   vol. II, Technical Proposal, at II-C-221.

   [12] The record indicates that the Navy's use here of the term "major
   weakness" was synonymous with "significant weakness," that is, a flaw in
   an offeror's proposal that appreciably increases the risk of unsuccessful
   contract performance. See FAR sect. 15.001.

   [13] The record reflects that the Navy conducted discussions in writing by
   sending each offeror a letter containing discussion questions on July 20;
   the agency also requested that offerors submit their discussion responses
   in writing by August 6. AR, Tab 14, Navy Discussions with DRS; Tab 16,
   Navy Discussions with GD. DRS submitted its discussion responses on August
   6, id., Tab 15, DRS Discussions Responses, and on August 13 the Navy
   confirmed an earlier notice to the offerors that FPRs were due by August
   16. DRS Protest, Dec. 31, 2007, exh. 9, Navy Email to Offerors. DRS argues
   that the agency's discussions with offerors did not close until the day on
   which the Navy confirmed the FPR closing date (August 13), rather than the
   date upon which discussion responses were due (August 6).

   [14] Acoustic waterfall display refers to the graphical representation of
   noise data (i.e., sonar) with respect to time. As time progresses, data is
   added to the top of the display screen and the existing data moves down.
   An improper display screen can flicker or flash over time, thereby causing
   eyestrain and/or headaches to the operator who is monitoring the acoustic
   waterfall display. Tr. at 170-73.

   [15] DRS originally alleged that GD's proposal also was noncompliant with
   the CEDS SRD requirement for hard-mounting. DRS Protest, Nov. 19, 2007, at
   22. The protester subsequently withdrew this aspect of its challenge to
   the agency's evaluation of GD's technical proposal. DRS Protest, Dec. 31,
   2007, at 20.

   [16] The RFP defined "unsatisfactory" as follows: the proposed approach
   indicates a lack of understanding of the program goals and the methods,
   resources, schedules and other aspects essential to performance of the
   program; numerous weaknesses and deficiencies exist; and the risk of
   unsuccessful contract performance is high. Id. at 74.

   [17] Additionally, the requirements verification test matrix portion of
   GD's proposal represented that its display screens would comply with
   the SRD requirements regarding acoustic waterfall data. Id., app. B,
   Requirements Verification Test Matrix, at 5.

   [18] DRS also argues that GD's proposed approach to meeting the display
   screen acoustic waterfall data requirements will not work "as a matter of
   science." DRS Comments, Jan. 18, 2008, at 29-30. Even if we assume that
   GD's proposed approach here will ultimately not meet all display screen
   requirements without modifications, however, that does not mean that the
   agency misevaluated the offeror's proposal.

   [19] The record also reflects that the Navy's evaluation of proposals was
   even-handed in this regard: in those instances where DRS's proposal also
   agreed to comply with the SRD requirements without providing details as to
   how, the SSEB likewise did not find this to be a deficiency or weakness.
   Tr. at 143.

   [20] DRS's protest also raised other issues regarding the evaluation of
   its technical proposal, many of which were resolved by the agency's
   acknowledgment of error. In its report to our Office, the Navy conceded
   that because GD [deleted], the awardee's proposal should have received
   similar major weaknesses under the technical and management factors for
   failing to comply with the SRD requirement for a POSIX-compliant operating
   system. AR, Dec. 19, 2007, at 28. The Navy also acknowledges that, with
   regard to the System Test and Qualification subfactor, DRS's proposal
   should have received a rating of "outstanding" rather than "very good."
   Id. at 32. Lastly, the Navy admits that DRS's proposal should have
   received two additional major strengths--equivalent to those given to
   GD--for data rights and open architecture assessment tool. Id. at 35; Tr.
   at 261-62. In sum, the Navy acknowledges that DRS's proposal merited two
   additional major strengths and that GD's proposal should have received two
   major weaknesses. In accordance with the RFP's stated evaluation scheme,
   the acknowledged errors would appear to result in the GD and DRS proposals
   having equivalent overall ratings of "very good" under both the technical
   approach and management approach and capabilities factors.

   [21] DRS bases its figure here on the alleged higher cost of SRD-compliant
   display glass, the fact that three pieces of glass were required for each
   CEDS display console, and the requirement of 601 CEDS production units.
   DRS Protest, Nov. 19, 2007, at 28 n.4. The protester provides no further
   support for its computation.

   [22] DRS also originally asserted that the Navy's cost evaluation was
   flawed because GD had offered "cheap versions of major cost drivers"
   (e.g., trackball, keyboard, joystick) and that GD's labor rates for
   program services appeared understated. DRS Protest, Nov. 19, 2007, at 27.
   The agency addressed all aspects of its evaluation of GD's cost/price
   proposal in its report to our Office. AR, Dec. 19, 2007, at 35-39. As
   DRS's comments did not again raise these aspects of the Navy's cost
   evaluation of GD's proposal, see DRS Comments, Dec. 31, 2007, at 62-63, we
   regard these issues as abandoned. Remington Arms Co., Inc., B-297374,
   B-297374.2, Jan. 12, 2006, 2006 CPD para. 32 at 4 n.4.

   [23] Likewise, a realism analysis is not ordinarily part of an agency's
   price evaluation because of the allocation of risk associated with a
   fixed-price contract. AST Envtl., Inc., B-291567, Dec. 31, 2002, 2002 CPD
   para. 225 at 2. To the extent an agency elects to perform a realism
   analysis in the competition for a fixed-price or fixed-price incentive
   contract, its purpose is not to evaluate an offeror's price but to assess
   risk or to measure an offeror's understanding of the solicitation's
   requirements; the offered prices may not be adjusted as a result of the
   analysis. FAR sect. 15.404-1(d)(3); Puglia Eng'g of California, Inc.,
   B-297413 et al., Jan. 20, 2006, 2006 CPD para. 33 at 6.

   [24] The RFP's other CLINs were either not separately priced, or expressly
   not part of the agency's cost and price evaluation. RFP sect. B at 1-7.

   [25] As contemplated here, a fixed-price incentive (firm target) contract
   specifies a target cost, a target profit, a price ceiling (but not a
   profit ceiling or floor), and a profit adjustment formula. See FAR sect.
   16.403-1.

   [26] In its report to our Office, the Navy acknowledged certain
   computational errors in its final cost evaluation report. AR, Dec. 19,
   2007, at 38. DRS's protest does not challenge the propriety of these
   adjustments.

   [27] The RFP also established that, with regard to the past performance
   evaluation factor, the agency would assign a rating of "neutral" where the
   offeror was found not to have relevant past performance. Id. at 74.

   [28] The GD past performance information consisted of: one CPAR for GD
   business unit [deleted]; three CPARs for proposed subcontractor [deleted];
   two CPARs for proposed subcontractor [deleted]; one CPAR for proposed
   subcontractor [deleted]; and two past performance questionnaires each for
   GD, [deleted]. AR, Tab 6, GD Past Performance Information.

   [29] The MCS contract involved the design, development, fabrication,
   testing and fielding of a programmable, multi-channel, multiple
   independent levels of security (MILS) cryptographic device for the Navy's
   Virginia- and Seawolf-class submarines.

   [30] The agency also furnished a declaration from the SSEB chairman
   stating that the evaluators had considered only past performance
   information for the divisions of General Dynamics that would actually
   perform work under the CEDS contract, namely [deleted], as well as
   proposed subcontractors. AR, Tab 31, Declaration of SSEB Chairman, at 1.

   [31] In response to questioning from the agency, the SSEB chairman
   concluded that two of the strengths identified in the evaluation report
   regarding GD's past performance (i.e., "[t]he Contractor developed,
   produced and certified a MILS system on a submarine without benefit of
   required [government-furnished equipment/government-furnished information]
   GFE/GFI (Major)," and "[t]he contractor managed to certify a MILS system
   installed on a submarine without benefit of the required GFE/GFI which is
   perceived as a risk reducer to meeting the separation kernel requirement
   (Major)") derived from the contractor's rebuttal in the MCS CPAR. Tr. at
   366-69; AR, Tab 20, Final SSEB Report, at 30.

   [32] Further, the record reflects that this is more than just the faulty
   memory of a single individual: the SSEB chairman stated that his deputy
   also had no recollection of having seen or considered the MCS CPAR as part
   of the Navy's evaluation.

   [33] The Navy also contends that, with regard to the management approach
   factor, the offerors' proposals are substantially equal, although GD
   maintains a comparative advantage based on one additional minor weakness
   in DRS's proposal. AR, Feb. 14, 2008, at 2.