Loren Data's SAM Daily™

FBO DAILY ISSUE OF NOVEMBER 30, 2007 FBO #2195
MODIFICATION

A -- Amendment to BAA W91CRB-08-R-0020

Notice Date
11/28/2007
 
Notice Type
Modification
 
NAICS
541712 — Research and Development in the Physical, Engineering, and Life Sciences (except Biotechnology)
 
Contracting Office
RDECOM Acquisition Center - Aberdeen, ATTN: AMSSB-ACC-A, 4118 Susquehanna Avenue, Aberdeen Proving Ground, MD 21005-3013
 
ZIP Code
21005-3013
 
Solicitation Number
W91CRB08R0020
 
Response Due
1/8/2008
 
Archive Date
3/8/2008
 
Point of Contact
Tyrone M. Knight, 410 278-2465
 
E-Mail Address
Email your questions to RDECOM Acquisition Center - Aberdeen
(tyrone.m.knight@us.army.mil)
 
Small Business Set-Aside
N/A
 
Description
Amendment 1: The Broad Agency Announcement (BAA) sponsored by the U.S. Army, Program Executive Office (PEO) Soldier, Product Manager Soldier Sensors and Lasers (PM-SSL), Bldg 325, 10170 Beach Rd., Ft. Belvoir, VA 22060-5820, and published by the U.S. Army Research, Development, and Engineering Command is amended as follows: The Soldier Wearable Gunfire Detection System (SW-GDS) technology development program is scheduled to run from Fiscal Year 2008, with a total budget of $669,000.00, and end with a system demonstration in the 2nd Quarter of Fiscal Year 2009. Interested parties are invited to submit proposals. Proposals shall be evaluated based on the criteria identified in paragraph 9.0 of the Statement of Objectives below. Proposals will be accepted until 08 January 2008 at 12:00 P.M. Responses will be directed to the RDECOM Acquisition Center, Aberdeen Contracting Division, 4118 Susquehanna Avenue, Aberdeen Proving Ground, MD. Contracting Point of Contact: Tyrone M. Knight, phone (410) 278-2465, fax (410) 306-3850, email: tyrone.m.knight@us.army.mil

Statement of Objectives

1.0 Summary
The U.S. Army, PEO Soldier, Product Manager Soldier Sensors and Lasers (PM-SSL), Bldg 325, 10170 Beach Rd., Ft. Belvoir, VA 22060-5820 is soliciting for a technology development program, the Soldier Wearable Gunfire Detection System (SW-GDS). The SW-GDS must detect sniper fire, provide alert data containing bearing and range to the source of fire, and visually display the results on the individual soldier's display. All components comprising a single system are to be worn by a single soldier. The single soldier is referenced as "the user" in this document. The following objectives of the SW-GDS program shall be used by offerors jointly with the other appropriate sections of the BAA when preparing proposals.
2.0 Background
An Operational Needs Statement (ONS) from the 101st Airborne was received in April 2003 requiring a portable, flexible, robust, and accurate means of detecting hostile fire. The objective from the ONS was to develop a low-cost, soldier-worn, micro-acoustic array that instantly alerts a soldier to the origin of hostile enemy fire.

3.0 SW-GDS Objective
To develop current technology into a system that meets the performance requirements stated below, under the stated conditions, and to demonstrate end-to-end performance of the system on government test ranges.

4.0 SW-GDS Requirements
(NOTE: Some requirements are stated with both Threshold and Objective values because some requirements are more difficult to achieve than others. The Threshold is the minimum the Government will accept for that particular requirement. Requirements not having both stated are considered thresholds.)

4.1 Mandatory Requirements
(NOTE: The two (2) mandatory requirements immediately below are Pass/Fail. If both are not met, the proposal will not be given any further consideration.)
4.1.1 Stand-alone, Single-Soldier Solution: The alert data displayed on a soldier's individual display cannot rely on sensor data from other surrounding sensors.
4.1.2 Placement: No component of the system can be mounted on the helmet.

4.2 General Requirements
4.2.1 Detection field of view of 360 degrees in azimuth.
4.2.2 Detect weapon variants firing 5.56mm and 7.62mm ammunition.
4.2.3 Detect and provide alert data for both single shots and short bursts (3 rounds spaced 0.5 seconds apart) from a single shooter.
4.2.4 Detect and provide alert data for multiple shooters in different locations, each firing 0.5 seconds apart.
4.2.5 System response time shall be less than 0.5 second, measured from the time the muzzle blast passes the sensor to the time the soldier receives an alert on the display.
4.2.6 System Size, Weight, and Power
4.2.6.1 Size
4.2.6.1.1 Threshold: No more than 12 cubic inches with a display volume of 5 cubic inches or less.
4.2.6.1.2 Objective: No more than 9 cubic inches with a display volume of 3 cubic inches or less.
4.2.6.2 Weight
4.2.6.2.1 Threshold: Total weight not to exceed 12 ounces.
4.2.6.2.2 Objective: Total weight not to exceed 8 ounces.
4.2.6.3 Power
4.2.6.3.1 Threshold: 12 hours minimum continuous use with a common commercially available battery (e.g., AA) in a 70 degree (F) environment.
4.2.6.3.2 Objective: 24 hours minimum continuous use with a common commercially available battery in a 70 degree (F) environment.
4.2.7 Display
4.2.7.1 Mode: User must be able to select the display lighting mode (day, night, and blackout).
4.2.7.2 Low Battery Indicator: Provide the user with an indication that the batteries need to be replaced.
4.2.7.3 Bearing: Must utilize clock-face notation using the soldier's chest as the 12 o'clock reference.
4.2.7.4 Range:
4.2.7.4.1 Threshold: Near Gunfire (less than 250 meters) and Far Gunfire (greater than 250 meters).
4.2.7.4.2 Objective: The actual calculated range.
4.2.7.5 Updates: Alert data must reorient on the display, without user intervention, to compensate for any translational and/or rotational motion the soldier may have made since the last shot. The update rate shall be once a second (1 Hz) or faster.
4.2.7.6 Clear Screen: User shall be able to clear the display of all shots.
4.2.8 Audio Alert
4.2.8.1 Audio Device: Alert data is provided to the user through a single ear piece (ear-bud).
4.2.8.2 Bearing and Range: A synthesized voice, in English, will verbally communicate alert data on the audio device. For example, "Far shot at 3 o'clock."
4.2.8.3 Volume: The user must have the ability to adjust the volume of the audio.
4.2.8.4 Updates: The user must have the ability to recall, at the user's initiation, the last alert data, compensated for any translation and/or rotation the soldier might have made since the last shot.
4.2.8.5 Independence: The audio alert will function properly without the display being attached.
4.2.9 Must be able to operate in both Open Terrain and Military Operations in Urban Terrain (MOUT) environments.
4.2.9.1 Threshold: Current environment determined by manual selection.
4.2.9.2 Objective: Current environment determined by automatic selection.
4.2.10 Environmental / Ruggedization
4.2.10.1 Low Temperature (Mil-Std-810F Method 502.4 Procedures I & II)
4.2.10.1.1 Storage:
4.2.10.1.1.1 Threshold: -20 Degrees Fahrenheit
4.2.10.1.1.2 Objective: -40 Degrees Fahrenheit
4.2.10.1.2 Operating:
4.2.10.1.2.1 Threshold: 0 Degrees Fahrenheit
4.2.10.1.2.2 Objective: -20 Degrees Fahrenheit
4.2.10.2 High Temperature (Mil-Std-810F Method 501.4)
4.2.10.2.1 Storage:
4.2.10.2.1.1 Threshold: 140 Degrees Fahrenheit
4.2.10.2.1.2 Objective: 160 Degrees Fahrenheit
4.2.10.2.2 Operating:
4.2.10.2.2.1 Threshold: 130 Degrees Fahrenheit
4.2.10.2.2.2 Objective: 150 Degrees Fahrenheit
4.2.10.3 Humidity (Mil-Std-810F Method 507.4)
4.2.10.4 Rain (Mil-Std-810F Method 506.4 Procedure I)
4.2.10.5 Sand & Dust (Mil-Std-810F Method 510.4 Procedure I)
4.2.10.6 Shock (Drop Test) (Mil-Std-810F Method 516.5)
4.2.10.6.1 Threshold: No damage or degraded performance resulting from a single 3-foot drop onto concrete.
4.2.10.6.2 Objective: No damage or degraded performance resulting from a single 4-foot drop onto concrete.
4.2.10.7 Radiated Emissions (Mil-Std-461E (RE102))
4.2.10.8 Radiated Susceptibility (Mil-Std-461E (RS103))
4.2.10.9 Electrostatic Discharge (ESD): Proper packaging materials and design techniques must be utilized in the system design to protect against any static electricity generated by the user's clothing.
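The display and audio conventions in 4.2.7 and 4.2.8 (clock-face bearing referenced to the soldier's chest, Near/Far classification at 250 meters, and a spoken alert such as "Far shot at 3 o'clock") can be sketched as follows. This is an illustrative sketch only, not part of the solicitation; the function names and the soldier-heading input are assumptions.

```python
def clock_bearing(shot_azimuth_deg, soldier_heading_deg):
    """Convert a shot azimuth to clock-face notation relative to the
    soldier's chest, where 12 o'clock is the direction the soldier faces."""
    rel = (shot_azimuth_deg - soldier_heading_deg) % 360
    hour = round(rel / 30) % 12          # 30 degrees per clock hour
    return 12 if hour == 0 else hour

def range_label(range_m, threshold_m=250):
    """Threshold display per 4.2.7.4.1: Near (< 250 m) vs. Far gunfire."""
    return "Near" if range_m < threshold_m else "Far"

def audio_alert(shot_azimuth_deg, soldier_heading_deg, range_m):
    """Compose the synthesized-voice alert string of 4.2.8.2."""
    hour = clock_bearing(shot_azimuth_deg, soldier_heading_deg)
    return f"{range_label(range_m)} shot at {hour} o'clock."
```

The modulo step also illustrates the reorientation requirement of 4.2.7.5: as the soldier's heading changes, recomputing `clock_bearing` with the new heading re-places the same shot on the display.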
4.3 Performance Requirements
4.3.1 Operational Ranges
4.3.1.1 Threshold:
4.3.1.1.1 Open Terrain Environment: Operational detection range out to 400 meters.
4.3.1.1.2 MOUT (urban) Environment: Operational detection range out to 200 meters.
4.3.1.2 Objective: Exceed one or both environment operational range thresholds.
4.3.2 Target Detection Probability
4.3.2.1.1 Threshold: 80% positive detection at wind speeds up to 10 mph and 60 dB background noise.
4.3.2.1.2 Objective: 90% positive detection at wind speeds up to 10 mph and 85 dB background noise.
4.3.3 Bearing Error
4.3.3.1.1 Threshold: The calculated error must be less than ±15 degrees of actual bearing.
4.3.3.1.2 Objective: The calculated error must be less than ±7.5 degrees of actual bearing.
4.3.4 Range Error
4.3.4.1.1 Threshold: The calculated error must be less than ±15% of actual range.
4.3.4.1.2 Objective: The calculated error must be less than ±10% of actual range.
4.3.5 Human Factor Dynamics
4.3.5.1 Threshold: Detect and provide alert data while walking fast (4.5 mph) along a generally horizontal (x-y) plane and/or rotating at speeds of 2 revolutions per second about the vertical (z) axis.
4.3.5.2 Objective: Exceed one or both speed thresholds.
4.3.6 Data Overload: The system must not fail (stop detecting, lock up, crash, etc.) if the system is overwhelmed with friendly and/or enemy gunfire.
4.4 Optional Requirements (Objectives)
4.4.1 Be able to forward soldier location and gunfire location data to higher headquarters using current networks.
4.4.2 Be able to identify caliber types (i.e., 5.56mm vs. 7.62mm).
4.4.3 Be able to accurately detect and provide detection data for silenced weapons.
5.0 Performance Verification, Analysis, and Report
5.1 Performance
5.1.1 A minimum of one (1) Open Terrain live-fire test, 3 days long, at a government test range to verify open terrain performance.
5.1.2 A minimum of one (1) MOUT live-fire test, 3 days long, at a government test range to verify MOUT performance.
A period of time during this test will be set aside to serve as a demonstration for Government invitees.
5.1.3 The offeror will perform a data analysis following each live-fire test. The offeror will write a report following each analysis, which will be presented to the Government.
5.2 Environmental
5.2.1 Testing will be accomplished by live-fire testing and/or by lab-bench stimulation. A Contractor/Government IPT will determine the most effective method(s) to accomplish this verification.
5.2.2 The offeror will explain any failure(s) of any test(s) in a written report. Any improvements that could be made shall be included in the report as well.
6.0 Government Furnished Equipment and Facility (GFE/GFF)
6.1 Firing Ranges, including weapons, personnel, ammunition, and MET (meteorological data).
6.2 Environmental Test Center, including test chambers/fields and personnel.
7.0 Hardware Prototypes
7.1 A minimum of ten (10) prototype systems will be built.
7.2 One (1) set of installation, calibration, training, and maintenance manuals for the system.
7.3 All software required to make the system perform will be included.
7.4 All prototype units, manuals, and software will be delivered to the Government at contract end.
8.0 Provide the following Contract Data Requirements (CDRLs):
8.1 Contractor's Progress, Status and Management Report, DI-MGMT-80227
8.2 Safety Assessment Report (SAR), DI-SAFT-80102B, to support field testing at a government facility, if required by the facility.
8.3 Test Plans, DI-NDTI-80566
8.4 Test/Inspection Report, DI-NDTI-80809B
8.5 Presentation Material, DI-ADMIN-81373
8.6 Interface Design Description (IDD), DI-IPSC-81436A
8.7 Software Design Description (SDD), DI-IPSC-81435A
8.8 System/Subsystem Design Description (SSDD), DI-IPSC-81432A
8.9 Final Report, DI-ADMN-80447
9.0 Evaluation Criteria and Rating Method
Each proposal will be evaluated on the merit and significance of the specific proposal.
Selection of the proposal for award will be based on an evaluation of which proposal is the most advantageous to the Government, considering technical and management relative merit in accordance with the evaluation criteria, cost, best value considerations, and availability of funds.
9.1 Factors: Three (3) factors will be used in the evaluation of the proposals.
9.1.1 Factor 1: Technical Area, containing three (3) Sub-factors
9.1.1.1 Sub-factor A: Technical Merit: The proposals will be evaluated on overall technical feasibility, which may include maturity level, capabilities, and limitations.
9.1.1.2 Sub-factor B: Gunfire Detection System Experience: The offeror will be evaluated on its ability and past experience with Gunfire Detection System development, which may include portable or vehicle-based applications. Development work specifically related to Soldier Wearable systems is a plus.
9.1.1.3 Sub-factor C: Qualifications: The offeror will be evaluated on its qualifications to perform the proposed effort. Qualifications may include key technical personnel, adequacy of laboratories, and other facilities.
9.1.2 Factor 2: Management Area, containing three (3) Sub-factors
9.1.2.1 Sub-factor A: Overall Planning and Scheduling: The offeror's approach will be evaluated based on the overall planning and scheduling in the offeror's proposal.
9.1.2.2 Sub-factor B: Expenditure Control: The offeror's approach for controlling labor hours, material expenditures, and redundant work.
9.1.2.3 Sub-factor C: Past Performance: The past performance of the offeror and first-tier subcontractors will be evaluated on schedule performance, cost control, general responsiveness to contract requirements, customer satisfaction, and customer focus.
9.1.3 Factor 3: Cost/Price Area, containing two (2) Sub-factors
9.1.3.1 Sub-factor A: Cost Realism: The proposals will be evaluated on the likelihood that the offeror's proposal can be accomplished at the cost/price proposed.
9.1.3.2 Sub-factor B: Overall Cost: The proposals will be evaluated to determine the overall cost to the Government. Other costs to the Government outside of the contract award will be included in the overall cost; excessive Government test center costs would be an example of increased overall cost.
9.2 Rating Method: The evaluation of the proposals will be based on three (3) factors, each having a percentage weighting given below. Relative Importance of Evaluation Criteria: Technical is more important than Management, and Management is more important than Cost/Price. Where appropriate, color ratings will be applied to depict how well the offeror's proposal meets the evaluation criteria.
9.2.1 Factors:
Factor 1: Technical Area (65% of total weight)
o Sub-factor A: 60% weight
o Sub-factor B: 30% weight
o Sub-factor C: 10% weight
Factor 2: Management Area (25% of total weight)
o Sub-factors are weighted equally.
Factor 3: Cost/Price Area (10% of total weight)
o Sub-factors are weighted equally.
9.2.2 Adjectival Ratings: Adjectival merit and risk ratings will be used, where appropriate, in the evaluation of the three (3) factors and their sub-factors. The ratings are defined in Appendix A.
10.0 Instructions, Conditions, and Notices to Offerors
10.1 Industry and Other Government Agencies (OGA) are invited to submit proposals addressing the technical approach, schedule, and estimated budget identified herein. A Statement of Work (SOW) shall be included in the offeror's response and shall be suitable for contractual incorporation. The SOW will be negotiated and accepted by PM-SSL prior to contract award.
10.1.1 Proposal Content: The proposal shall include a full discussion of the scope, nature, and rationale for the technical approach and methodology, expected results, and any state-of-the-art technology advancement. The proposal shall also include a summary description of management planning and control, configuration management, and product assurance.
Any patent rights, limited rights to technical data, and restricted rights in computer software all need to be identified and included in the proposal. A discussion of the offeror's capabilities and qualifications, with additional information related to key personnel and facilities, should be included in the proposal. The proposed cost should take into account a period of performance of twelve (12) months.
10.1.1.1 The offeror shall break down the cost for each subtask within the SOW.
10.1.1.2 Offerors proposing subcontracts to perform portions of the SOW should clearly identify the specific tasks for which they plan to utilize subcontractors, as well as the method and level of integration/coordination between the prime Contractor and all proposed subcontractors, and the expected advantages of such an approach. The offeror shall break down the cost for subcontractor tasks.
10.1.1.3 The proposal should include backup analyses covering technical issues and current IR&D as needed.
10.1.1.4 The proposal shall indicate any Government or commercial data that would be necessary for the offeror to meet the SW-GDS program requirements and to minimize risk.
10.1.1.5 For proposals that reference similar gunfire detection contracts, either past or current, the offeror shall provide full Point-of-Contact information at the Government agency where the work was monitored.
10.1.2 Evaluation Process: Proposals will be accepted until 08 January 2008 at 12:00 P.M. Responses will be directed to the RDECOM Acquisition Center, Aberdeen Contracting Division, 4118 Susquehanna Avenue, Aberdeen Proving Ground, MD. Within 4 weeks, each proposal will be evaluated based on the information and requirements stated herein and the best candidate will be selected. The selection will be forwarded to the RDECOM Acquisition Center, Aberdeen Contracting Division, Aberdeen Proving Ground (APG) Contracting Office, with PM-SSL issuing a contract through the APG Contracting Office to the selected candidate.
General information and specific program information relevant to the acquisition will be made available on the Army Single Face to Industry (ASFI) Acquisition Business Web Site. The URL for the ASFI is https://acquisition.army.mil.
10.1.3 Proposal Submission: The Government intends to conduct all communications, including proposal submission, electronically. If electronic submission is not feasible, offerors are invited to submit proposals (original plus 6 copies) to Mr. Tyrone M. Knight, Contract Specialist, US Army Research, Development & Engineering Command Acquisition Center, 4118 Susquehanna Ave, Aberdeen Proving Ground, MD 21005, telephone: (410) 278-2465, facsimile: (410) 306-3850, email: Tyrone.M.Knight@us.army.mil. The proposals belonging to offerors not selected for an award will be disposed of in a manner that protects proprietary data. All proprietary material should be clearly marked and will be held in the strictest confidence. The document page limits are as follows: proposals may not exceed 30 pages; must be single-sided with double-spaced text; page size no larger than 8 1/2 x 11 inches; font size no smaller than 12 point; one-inch left and right margins; 1.25-inch top margins; and 1.0-inch bottom margins on all pages. Please include contact information (i.e., company name, point of contact, e-mail, telephone, and mailing address). The POC for all technical inquiries is Mr. Tyrone M. Knight, telephone: (410) 278-2465, facsimile: (410) 306-3850, email: Tyrone.M.Knight@us.army.mil. Prospective offerors and other interested parties are invited to request additional information or direct questions to Mr. Tyrone M. Knight at the same contact points.
10.2 The Government does not intend to pay for the information solicited. The Government may publish the questions, comments, and responses without identification of the source on ASFI.
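The factor and sub-factor weights in 9.2.1 combine as a simple weighted sum. The sketch below is hypothetical: the BAA assigns adjectival (color) ratings rather than numeric scores, so the 0-100 scale and the function name are assumptions made only to show how the stated percentages roll up.

```python
# Factor and sub-factor weights from 9.2.1:
# Technical 65% (sub-factors 60/30/10), Management 25%, Cost/Price 10%.
WEIGHTS = {
    "technical":  (0.65, [0.60, 0.30, 0.10]),   # Sub-factors A, B, C
    "management": (0.25, [1/3, 1/3, 1/3]),      # weighted equally
    "cost":       (0.10, [0.5, 0.5]),           # weighted equally
}

def weighted_score(subfactor_scores):
    """Roll hypothetical 0-100 sub-factor scores up into one total."""
    total = 0.0
    for area, scores in subfactor_scores.items():
        area_weight, sub_weights = WEIGHTS[area]
        total += area_weight * sum(w * s for w, s in zip(sub_weights, scores))
    return total
```

For example, a perfect Technical Merit score with zeros everywhere else contributes 0.65 x 0.60 x 100 = 39 points, reflecting that Sub-factor A of the Technical Area carries the largest single share of the total weight.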
Appendix A: Adjectival Ratings

Merit Ratings
Blue (Excellent): Proposal demonstrates an excellent understanding of requirements; the offeror's approach significantly exceeds requirements; the offeror has exceptional strengths that will significantly benefit the Government.
Green (Good): Proposal demonstrates a good understanding of requirements; the offeror's approach exceeds requirements; the offeror has one or more strengths that will benefit the Government.
Yellow (Acceptable): Proposal demonstrates an acceptable understanding of requirements; the offeror's approach meets requirements; the offeror has few or no strengths; the offeror's proposal provides an acceptable solution.
Pink (Marginal): Proposal demonstrates little understanding of requirements; the offeror's approach marginally meets requirements; the offeror exhibits more weaknesses than strengths. Requirements possibly could be met with minor changes to the proposal.
Red (Unacceptable): Proposal fails to demonstrate an understanding of requirements; the offeror's approach meets only a few requirements. Requirements possibly could be met with major changes to the proposal.

Merit Risk Ratings
Green (Low Risk): The proposal's approach has weaknesses that can potentially cause little disruption of schedule, an increase in cost, or a degradation of performance. Normal contractor effort and normal monitoring by the Government will probably minimize any difficulties.
Yellow (Medium Risk): The proposal's approach has weaknesses that can potentially cause some disruption of schedule, an increase in cost, or a degradation of performance. However, special contractor emphasis and monitoring by the Government will probably minimize difficulties.
Red (High Risk): The proposal's approach has weaknesses with the potential to cause serious disruption of schedule, an increase in cost, or a degradation of performance even with special contractor emphasis and monitoring by the Government.

Past Performance Risk Ratings
Green (Low Risk): Based on the offeror's past performance record, essentially no doubt exists that the offeror will successfully perform the required effort.
Yellow (Medium Risk): Based on the offeror's past performance record, some doubt exists that the offeror will successfully perform the required effort.
Red (High Risk): Based on the offeror's past performance record, extreme doubt exists that the offeror will successfully perform the required effort.
Gray (Unknown Risk): No relevant performance record is identifiable upon which to base a meaningful performance risk prediction. The offeror's feedback and a search were unable to identify any relevant past performance information for the offeror or key team members or subcontractors or their key personnel. This is neither a negative nor a positive assessment.

ALL POTENTIAL OFFERORS SHOULD BE AWARE THAT DUE TO UNANTICIPATED BUDGET FLUCTUATIONS, FUNDING IN ANY OR ALL AREAS CAN CHANGE WITH LITTLE OR NO NOTICE.
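The accuracy thresholds in 4.3.3 and 4.3.4 of the Statement of Objectives above (±15 degrees of actual bearing, ±15% of actual range) imply a straightforward pass/fail check during the live-fire data analysis of 5.1.3. A minimal sketch, assuming per-shot truth data from the test range; the function names are illustrative, not part of the solicitation:

```python
def bearing_within_threshold(actual_deg, calculated_deg, tol_deg=15.0):
    """Pass/fail against the +/-15 degree bearing-error threshold (4.3.3).
    The wrap-around keeps 359 deg vs. 1 deg a 2 degree error, not 358."""
    error = abs((calculated_deg - actual_deg + 180) % 360 - 180)
    return error < tol_deg

def range_within_threshold(actual_m, calculated_m, tol_frac=0.15):
    """Pass/fail against the +/-15% range-error threshold (4.3.4)."""
    return abs(calculated_m - actual_m) < tol_frac * actual_m
```

Note the range tolerance is relative to actual range, so a 30-meter error passes at 400 meters but fails at 100 meters, while the bearing tolerance is absolute regardless of distance.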
 
Place of Performance
Address: RDECOM Acquisition Center - Aberdeen ATTN: AMSSB-ACC-A, 4118 Susquehanna Avenue Aberdeen Proving Ground MD
Zip Code: 21005-3013
Country: US
 
Record
SN01459451-W 20071130/071128223644 (fbodaily.com)
 
Source
FedBizOpps
