
In particular, Swets contends that it demonstrated that both of its quoted products reached the RFQ-required information in fewer screen selections than the VA's count reflects.

Swets also argues that the agency's evaluation of the number of screen selections required to reach the required information was unreasonable, disparate from how the agency treated Cox, and undocumented. As explained below, we have been unable to determine from this record how the agency's evaluators reached the number of screen selections for each of Swets' databases at the product demonstration. As detailed above, the solicitation established that the VA would use a product demonstration as the primary means for determining the technical merit of each vendor's offering(s), and that the agency's evaluation would be based on, among other things, the number of screen selections required to reach the desired information. When conducting the product demonstration, the agency evaluators and vendor representatives were not in the same physical location; rather, the demonstrations were conducted using screen-sharing software (e.g., Go To Meeting or Web Join).

Based on these assessments, both Harbert's and Clark/F-P's proposals were rated as good under Area 3. After receiving revised proposals, the TEP increased Harbert's rating to excellent without explanation, other than the statement that "the TEP accepted that the Offeror corrected the weakness and the ranking was changed to Excellent." However, evaluators and selection officials should reasonably consider the underlying bases for ratings, including the advantages and disadvantages associated with the specific content of competing proposals, in a manner that is fair and equitable and consistent with the terms of the solicitation.

See AR, Tab 9, TEP Final Consensus Evaluation Report, at 3. See MD Helicopters, Inc.; Agusta Westland, Inc., B-298502 et al., Oct. Indeed, the Federal Acquisition Regulation (FAR) requires that agencies sufficiently document their judgments, including documenting the relative strengths, deficiencies, significant weaknesses, and risks supporting their proposal evaluations.

In light of the contrary information provided by the protester, the agency evaluator worksheets, which document only evaluation results, are not sufficient to demonstrate the reasonableness of the agency's evaluation. Where an agency fails to document or retain evaluation materials, it bears the risk that there may not be adequate supporting rationale in the record for us to conclude that the agency had a reasonable basis for its source selection decision. The destruction of individual evaluator documents, however, does not render an agency's evaluation unreasonable per se; rather, we will consider the record adequate if the consensus documents and source selection decision sufficiently document the agency's rationale for the evaluations. At no time did any member of the team state that their ratings were not included. The material of the bags is crinkly and may impact barcode readability through the mail stream. Further, these summary conclusions in the source selection document are followed by detailed, comprehensive evaluator findings with regard to each of the four sizes of sample bags submitted by M-Pak, Custom Pak, Star Poly, and the other vendors under each of the five evaluation categories.

In the final analysis, it may well be that the agency had a reasonable basis for concluding, notwithstanding this significant difference in the proposed staffing of DS2 and M7, that the proposals nonetheless were technically equal.

Here, the record does not provide the underlying bases for the TEP's decision to increase the rating of Harbert's proposal to excellent after receiving revised proposals.

As noted above, the TEP final consensus evaluation report merely states that "the TEP accepted that the offeror corrected the weakness and the ranking was changed to Excellent," without providing additional explanation.

In most instances, the agency evaluators agreed with each other about the number of screen selections needed to reach the desired information for each of the required items. The agency argues that the evaluators agreed on the methodology for counting screen selections prior to conducting the product demonstrations.

AR, Tab 9, Evaluator Worksheets, May 16, 2014, at 1-30. Swets asserts that the agency's evaluation of Swets' screen selections was inaccurate and unreasonably high. Swets also provided screen-by-screen walkthroughs for demonstration item c for Lexi-Comp (three screen selections as compared to the VA's count of six screen selections), and for demonstration item l for F&C (three screen selections as compared to the VA's count of six screen selections). In this regard, the agency states, in its response to the protest, that the evaluators counted the number of screen selections as each screen was being clicked on and counted transitioning to the next screen, which included all drop-down menus and sub-menus. 11, 2014, at 4; AR, Tab 12, Declaration of VA Evaluator J. Thus, the agency asserts that the evaluators accurately counted vendors' screen selections based on this established methodology.
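In other words, the parties' disagreement largely turns on which on-screen actions count as a "screen selection." The following is a minimal sketch, in Python, of how two counting rules applied to the same walkthrough could produce three versus six selections. The walkthrough, the action names, and the assumption that the vendor counted only screen transitions are illustrative only and are not drawn from the record.

# Hypothetical model of the counting dispute. The evaluator rule follows the
# agency's description (screen transitions plus drop-down menus and sub-menus);
# the vendor rule, counting screen transitions only, is an assumption.
COUNTED_BY_VENDOR = {"screen_transition"}
COUNTED_BY_EVALUATORS = {"screen_transition", "drop_down_menu", "sub_menu"}

def count_selections(actions, counted_actions):
    # Tally each logged action that the given rule treats as a selection.
    return sum(1 for action in actions if action in counted_actions)

# One illustrative walkthrough of a demonstration item.
walkthrough = ["screen_transition", "drop_down_menu", "sub_menu",
               "screen_transition", "sub_menu", "screen_transition"]

print(count_selections(walkthrough, COUNTED_BY_VENDOR))      # 3
print(count_selections(walkthrough, COUNTED_BY_EVALUATORS))  # 6

The sketch shows only that the same walkthrough yields different totals depending on which actions are treated as selections, which is why the counting methodology, and the documentation of how it was applied, matters to the reasonableness of the evaluation.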


Specifically, TEG contends that the agency did not determine whether the submitted past performance references of these offerors were relevant and permitted these offerors to provide fewer than the required number of references. The critical question is whether the evaluation was conducted fairly, reasonably, and in accordance with the solicitation's evaluation scheme, and whether it was based on relevant information sufficient to make a reasonable determination of the offeror's past performance. An agency is required to consider, determine, and document the similarity and relevance of an offeror's past performance information as part of its past performance evaluation. For example, as to the relevance of Blue Law's past performance, the evaluation documentation only states: "Blue Law identified their relevant past performance and was rated Exceptional by two, Excellent by six, and Good by one of their Past Performance Surveys submitted by Contracting Officers that have worked directly with them."

