Smart Procurement

This guidance for governments on protecting civil liberties when procuring technology identifies essential guardrails for ensuring that risks to privacy, civil liberties, and civil rights are adequately considered and addressed whenever the government seeks to procure or utilize technology that affects people. Heightened connectivity, more sophisticated cameras and sensors, and the emergence of powerful artificial intelligence (“AI”) applications are a few of the factors driving governments across the country to seek out new technologies, often without considering potentially grave implications for the public interest.

1. APPENDIX A: VENDOR ASSESSMENT QUESTIONNAIRE (“VAQ”)

Vendors providing government entities with technology products and/or services that impact the public, or that use applications or solutions carrying substantial inherent risk (e.g., AI, algorithmic decision-making, black box systems, surveillance, etc.), should be evaluated separately for vendor risk. After assessing inherent risk, government entities should integrate the questions found in this Appendix (as appropriate) into RFPs (for example, into the “Requirements” section of government entity RFPs) and other documents for soliciting vendor bids. The vendor questions fall within five (5) distinct areas:

  • Background: General questions about the vendor (some of which already may be solicited in RFPs)
  • AI: Questions related to the use of any AI or algorithmic decision-making technology in the project
  • Privacy: Questions focused on identifying vendor issues with respect to preserving privacy
  • Security: Questions that broadly cover security concerns raised when a vendor is handling or managing personally identifiable information (including highly sensitive information)
  • Governance: Questions that relate to data management practices generally, and most importantly, destruction of data

Practice Tip: These questions are meant to serve as a starting point for building a meaningful VAQ framework. Government entities should err on the side of asking more questions of the vendor rather than fewer. Moreover, responses to these questions will help inform any inherent risk assessment, and vice versa.
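
For illustration only, the following sketch shows one way VAQ responses might feed a simple inherent-risk rubric. All field names, weights, and thresholds are hypothetical and are not part of this Guidance; any real scoring methodology should be developed by the government entity in light of its own risk posture.

```python
# Illustrative only: a minimal weighted-rubric sketch for turning VAQ
# responses into a rough inherent-risk score. All field names, weights,
# and thresholds below are hypothetical.

# Hypothetical risk weights for selected VAQ topics (higher = riskier).
RISK_WEIGHTS = {
    "uses_ai": 3,                    # B16 / AI section
    "collects_sensitive_pii": 3,     # P1d
    "involves_surveillance": 2,      # B15
    "shares_with_third_parties": 2,  # P5
    "collects_minors_pii": 2,        # P1e
    "prior_data_breach": 1,          # S8
}

def inherent_risk_score(responses: dict) -> str:
    """Sum weights for each 'yes' answer and bucket the total."""
    score = sum(w for key, w in RISK_WEIGHTS.items() if responses.get(key))
    if score >= 7:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# Example: a vendor using AI on sensitive PII with third-party sharing.
print(inherent_risk_score({
    "uses_ai": True,
    "collects_sensitive_pii": True,
    "shares_with_third_parties": True,
}))  # -> "high"
```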

BACKGROUND

B1 Vendor name.
B2 Name of contracting department(s) or agency.
B3 Name of product or service offered.
B4 Project name or designation.
B5 Identify relevant vendor contact(s) for this project.
B6 What is the targeted launch date for this project?
B7 What is the intended duration of the project?
B8 Provide a brief description of the product or service.
B9 Have you previously worked with the government agency or entity that issued this RFP? If so, explain.
B10 Have you worked with other municipalities or local governments, and if so, which, and in connection with which project(s)?
B11 How long has the product or service been in use on the open market?
B12 Identity of any subcontractors or affiliates to be engaged on the project.
B13 Please specify each source for and type of information or data set(s) that you anticipate accessing, collecting, using, processing, or otherwise handling as part of this project (e.g., public records, internal municipal information, PII, videographic materials, metadata, etc.).
B14 Please describe any specialized hardware that will be used as part of this project.
B15 Will this project involve the use of any surveillance or monitoring technologies?
B16 Does the project involve the use of AI, computer vision, machine learning, algorithmic or automated decision making (e.g., profiling), image/object recognition, speech and text analytics, predictive modeling, and/or other forms of automation or AI-related technologies?
B17 Does this project make use of open source software?

AI

A1 Please describe any form of AI (an “AI application,” including, but not limited to, computer vision, machine learning, algorithmic or automated decision-making (e.g., profiling), image/object recognition, speech and text analytics, predictive modeling, and/or other forms of automation or AI-related technologies) that will be used in connection with any PII accessed, collected, or processed for this project.
A2 If an AI application will be used, please describe measures taken to prevent algorithmic or any other form of bias, as such measures relate to:
A2a training of individual developers on unconscious bias;
A2b any mechanisms in place to prevent the psychological, social, emotional, personal, and cultural contexts or biases of the humans developing the AI application and training data set from resulting in an AI application that embeds the human bias of its creators;
A2c the type of data being used to train the AI application, including where it is being sourced from and whether it is “full spectrum” data, along with any efforts taken to ensure that any training data set used was representative and unbiased;
A2d thresholds triggered when aberrant or biased feedback is detected;
A2e preventing unfair, inequitable, or discriminatory outcomes for any individual or class of individuals.
A3 Are there any possible use cases associated with the AI application by which PII may be used in a manner that can impact any individual’s personal interests, rights, or liberties, including, but not limited to, an individual’s economic, educational, or employment interest(s), criminal record, health, access to goods and services (whether private or public), mobility, or any other sensitive category? If so, explain.
A4 Has the AI application been assessed for its potential to cause unintended consequences that may harm individuals, including loss of opportunity, economic loss, social detriment, or loss of liberty? If so, please describe the nature of that assessment, the outcome, and any steps taken to mitigate potential harm(s).
A5 Is the AI application interpretable (i.e., transparent in its operation) and auditable (i.e., explainable in its behavior)? If so, please describe tools used to interpret and audit the AI application, including through regular reporting.
A6 How will performance of the AI application be monitored on an ongoing basis to ensure that it does not exhibit bias? (An illustrative monitoring sketch follows this section.)
A7 What safeguards are put in place to ensure that the AI application is being used properly and working as intended?
A8 Will the AI application use PII in a way that it has not previously been used by the contracting agency or department to make an inference or decision about any individual or class of individuals? If so, how?
A9 Is there any possibility that the use of the AI application may disadvantage those who lack English fluency?
A10 If a learning model was used, please describe the nature of that model (i.e., supervised or unsupervised).
A11 What is the level of autonomy for the AI application? (H = fully autonomous system; M = autonomous system with human escalation; L = human decision informed by the system)
A12 Will there be a “kill-switch” in place, with conditions under which the AI application should be abandoned until remediated? If so, please describe. (See the sketch following this section.)
A13 Will there be human overrides for decisions or conclusions reached by the AI application? If so, please describe.
A14 Will the project leverage natural language processing (speech recognition, NLU, NLG) or convolutional neural networks?
A15 How does the AI application prioritize critical and high-risk data that require immediate attention from a human?
A16 Please describe any safeguards in place to protect the integrity of the AI application (including all underlying algorithms and data sets used) from human tampering or manipulation.
A17 If available, please provide copies of any of your policies or procedures relating to algorithmic bias and de-biasing for review.
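
For illustration only, the sketch below shows one way the ongoing bias monitoring contemplated in A6 could be wired to the kill-switch condition contemplated in A12. The fairness metric (a demographic parity gap), group labels, and threshold are all hypothetical choices; an actual engagement would need metrics and tolerances agreed between the contracting entity and the vendor.

```python
# Illustrative only: monitor outcomes across groups and suspend the AI
# application when a hypothetical fairness threshold is exceeded.

def demographic_parity_difference(outcomes: list[tuple[str, int]]) -> float:
    """outcomes: (group_label, favorable_outcome 0/1) pairs.
    Returns the gap between the highest and lowest group rates."""
    totals, favorable = {}, {}
    for group, y in outcomes:
        totals[group] = totals.get(group, 0) + 1
        favorable[group] = favorable.get(group, 0) + y
    rates = [favorable[g] / totals[g] for g in totals]
    return max(rates) - min(rates)

KILL_SWITCH_THRESHOLD = 0.20  # hypothetical tolerance set by policy

def should_suspend(outcomes) -> bool:
    """True if the application should be suspended pending remediation."""
    return demographic_parity_difference(outcomes) > KILL_SWITCH_THRESHOLD

# Example: group B receives favorable outcomes far less often than group A.
sample = [("A", 1), ("A", 1), ("A", 0), ("B", 0), ("B", 0), ("B", 1)]
print(demographic_parity_difference(sample))  # ~0.333
print(should_suspend(sample))                 # True -> escalate to a human
```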

PRIVACY

P1 Will this project require you to access, collect, maintain, analyze, disclose, review, destroy, use, and/or process, in any manner, any of the following types of data?
P1a Public information or records? If so, specify.
P1b Internal use information maintained by the contracting agency or department? If so, specify.
P1c Any personally identifiable information ("PII")? If so, specify.
P1d Highly sensitive PII, such as geolocation, biometric identifiers, social security or other government ID numbers, etc.? If so, specify.
P1e Will any PII be collected from individual(s) under the age of 18?
P1f Are there any data elements, not identified in response to Items 1a-1d, that will be collected or used?
P1g For each data element identified in response to Item 1, please confirm that all of those data elements are necessary to fulfill the project purpose.
P2 For each type of data specified in response to Item 1 that will be collected or used, please identify:
P2a the manner(s) in which each type of data will be collected or used (e.g., tools, software, hardware, data sets, other sources, etc.);
P2b whether data will be collected from individuals directly or another party;
P2c whether individuals whose PII is collected or used will be provided notice prior to collection and, if notice will be provided, please provide a copy of the notice;
P2d whether PII is being collected or used with or without the consent of individuals;
P2e whether the data will be de-identified, anonymized, or pseudonymized.
P3 If PII is being collected or used, will individuals be able to opt out of the collection or use of their PII?
P4 Will PII be collected from specifically identifiable communities, neighborhoods, areas, groups, and/or other demographics? If so, specify.
P5 Will PII be shared with any third party? If yes, please respond to P5a-P5d.
P5a Please identify each third party with whom PII will be shared;
P5b Please specify which data element(s) will be disclosed to each third party;
P5c Please identify the purpose for which each data element(s) will be disclosed to each third party;
P5d Please confirm whether a contractual agreement is in place with each third party that will be receiving PII.
P6 Will PII be used in a way that it has not been used before to make an inference or decision about any individual? If so, how?
P7 Will you be selling, monetizing, or utilizing any data elements collected or received during the project for any purpose not set forth in the agreement with the contracting public agency or department?
P8 Will PII be de-identified or anonymized? If so, please explain.
P9 Will PII collected as part of this project be commingled with other personal information already stored by the contracting agency or department?
P10 If the project involves surveillance, please describe the circumstances of the surveillance (e.g., open or secret, means of surveillance, etc.).
P11 What procedures would you follow when receiving and responding to a subpoena (of any kind) seeking information collected or stored by you in connection with any engagement with a public agency or department?
P12 What measures are in place, if any, to process and adjudicate challenge(s) to the accuracy of PII collected or stored in connection with this project?
P13 Please provide copies of your privacy policies for review.

SECURITY

S1 Has a security / cybersecurity risk assessment been completed for this project? If so, please attach a copy. If not, please explain why.
S2 Is there a formal information security policy and program in place? If so, please provide relevant documentation.
S3 Are employees required to attend regular security awareness and training sessions?
S4 Please describe all controls (including physical, technical, and access-based) in place to ensure confidentiality, integrity, and security of any PII involved in the project.
S5 Will PII stored in connection with this project be encrypted? If so, please specify the method(s) of encryption and the circumstances under which information is encrypted (e.g., at rest, in transit, etc.). (An illustrative encryption sketch follows this section.)
S6 Is the information used by this solution structured, unstructured, or a combination?
S7 Is there a formal incident management policy and program in place?
S8 Have you had any data breaches in the past five (5) years? If so, please provide a detailed explanation for each breach.
S9 Will the product or service used in this project result in any anticipated interference or disruption to existing municipal systems, or any unanticipated alteration to existing municipal data sets? If so, please explain.
S10 Is the purpose of the project acquisitional, transactional, analytical, or something else entirely?
S11 If this is an analytical solution, what is the nature of the analytics (e.g., descriptive, predictive, prescriptive, etc.)?
S12 Will metadata be used in connection with underlying data collected or used for this project? If so, please describe the metadata management plan.
S13 Have security audit log requirements for the system been determined and documented?
S14 Please describe data storage and communications protection requirements.
S15 To the extent necessary, how are security updates developed and delivered?
S16 Please provide a general description of your data loss prevention (DLP) policies and practices.
S17 Please provide a general description of your operational recovery and disaster recovery strategies.
S18 Please provide a general description of your security assessment and vulnerability scanning processes.
S19 Will the project result in any additional security exposure to municipal systems or data sets? If so, please explain.
S20 Do you maintain a cyber insurance policy? If so, please describe coverage conditions and extent of coverage.
S21 Please provide any additional information relating to your cybersecurity policies, practices, and procedures that would be helpful to aid in this review.
S22 Will there be safeguards in place to effectively identify where and how data was used or accessed in the event of a security breach? If so, please describe.
S23 Please provide copies of your security policies for review.
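
For illustration only, and relevant to S5, the sketch below shows field-level encryption of PII at rest using the third-party cryptography package (pip install cryptography). It is a minimal example, not a required method; key management, key rotation, and in-transit protections such as TLS are separate and equally important concerns.

```python
# Illustrative only: encrypting a sensitive field before storage, then
# decrypting it for an authorized use.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, hold keys in a key management service
fernet = Fernet(key)

record = {"name": "Jane Doe", "ssn": "123-45-6789"}  # hypothetical record

# Encrypt the sensitive field before it is written to storage.
stored = {"name": record["name"], "ssn": fernet.encrypt(record["ssn"].encode())}

# Decrypt only when an authorized process needs the value.
print(fernet.decrypt(stored["ssn"]).decode())  # -> "123-45-6789"
```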

GOVERNANCE

G1 Please provide a summary of the data flow through your systems and any connected or related systems, device(s), application(s), tool(s), and/or databases as part of this project (alternatively, you may provide a data flow diagram). The summary (or diagram) should identify all components (internal and external) involved, together with an identification of each data element passing between systems and databases.
G2 If any PII will be stored, where (geographically) will the data be stored?
G3 If any PII is stored, in what format(s) will the PII be stored? For example, will PII be stored electronically, on paper/hard copy, internal network drive, external network drive, database, removable device, email, cloud, and/or any other format(s)?
G4 If any PII is stored as part of this project, what is the retention period for any stored PII, and what is the justification for that retention period?
G5 How will PII be disposed of when no longer needed for the purpose for which it was originally collected? For example, will such data be deleted, destroyed, returned to the contracting public agency or department, anonymized, de-identified, or handled in another manner?
G6 How will your retention period be applied to PII collected and stored as part of this project? For example, will it be implemented using technical solutions, automatically, manually, with the aid of a subcontractor, etc.? (An illustrative sketch follows this section.)
G7 How do you enforce data standards?
G8 Will cloud computing be used to store or process information related to this project? If so, please identify the type of cloud service.
G9 If third parties (e.g., subcontractors or other partners) will be accessing PII collected in connection with the project or contract, do you have agreements in place with such third parties to ensure that any shared PII will be kept safe, secure, confidential, and private?
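
For illustration only, and relevant to G4 and G6, the sketch below shows one technical approach to automated retention enforcement using Python's standard sqlite3 module. The table name, column names, and retention period are hypothetical.

```python
# Illustrative only: purge PII records older than a policy-defined
# retention period from a hypothetical local database.
import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 365  # hypothetical policy value, with documented justification

conn = sqlite3.connect("project_data.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS pii_records "
    "(id INTEGER PRIMARY KEY, subject TEXT, collected_at TEXT)"
)

# Delete (or route to anonymization) records past the retention period.
cutoff = (datetime.utcnow() - timedelta(days=RETENTION_DAYS)).isoformat()
deleted = conn.execute(
    "DELETE FROM pii_records WHERE collected_at < ?", (cutoff,)
).rowcount
conn.commit()
print(f"Purged {deleted} expired record(s)")
```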

 

2. APPENDIX B: GOVERNMENT ALLIES FOR SMART PROCUREMENT

This Appendix identifies possible allies who can help inform a government agency’s decision-making and risk assessments during the procurement process. It also offers insight into the role or function of these allies and ways in which working collaboratively with them can help inform smart procurement strategies. Although these allies may not naturally be aligned in terms of privacy protection, this Appendix, in conjunction with other tools provided in this Guidance, can help identify other organizations, departments, or agencies to engage with. Given that City, County, and State governance structures can differ, please keep in mind that similar positions may be housed in a different department or hold a different title depending on the jurisdiction.

PROCUREMENT OFFICE

Who: Procurement Officer, Procurement Team
What: While every office and government system may be structured differently, the procurement office may be tasked with governing the process for purchasing new technology. Traditionally, this office may work independently from government agencies that are directly seeking technology or serve an advisory role to the process.
Tip: As a government entity that may utilize technology, consult relevant procurement professionals before engaging in the procurement process.
How: Working collaboratively from the outset of the procurement process can help the agency identify community needs, learn about potential vendors, and anticipate and flesh out necessary contract provisions, all of which will help inform the government agency’s assessment of risk and strengthen the government’s bargaining position during negotiations with a vendor.
  • By working together, the agency seeking new technology can articulate why it is seeking a specific technology and how that technology helps achieve its goals.
  • The procurement office can help stakeholders understand how technical requirements can effectively be mapped in RFPs, memorialized in contract language, and reviewed once performance of the contract begins.
  • This collaboration can increase the likelihood that necessary contract provisions are included, inspections are completed, and the best deal is negotiated with a vendor.

TECHNOLOGY OFFICE

Who: Technology Officer, Information Officer, Chief Technologist, Technology Team(s)
What: Technology professionals will have the most technical knowledge, and thus often will be helpful allies when considering new technology procurement initiatives. These professionals also may be tasked with assessing the contemplated technology from a more detailed technical perspective, and may play a role in implementing and auditing the technology.
Tip: Utilize the knowledge base of this office to better understand the technology and how it works, identify possible privacy and security concerns, and identify training or additional resources that may be necessary to effectively implement the technology.
How: Work with the technology office to break down the technology so that it can be understood by anyone, regardless of how much technical knowledge they have.
  • Working with the technology team will facilitate better understanding of the technology, how it works, and its possible implications.
  • Begin communicating with the technology office when a technology is first under consideration, to gauge the staff bandwidth available to support it.
  • Government agencies should avoid over-relying on marketing and other vendor materials designed to sell a tech product or service, and should not hesitate to rely on the internal tech team to strengthen their understanding of potential issues raised by contemplated technology.
  • By facilitating open communication about what resources are already available (such as personnel and support equipment needed to implement the new technology) and what are still needed, the government agency will be able to better prepare for the negotiation process with the vendor and adopt a more cost-effective approach.

INDEPENDENT OVERSIGHT OFFICE

Who: Inspector General, Internal Auditors
What: Departments and individuals responsible for assessing an agency’s operations by auditing or investigating their activities. As such, the role often includes assessing the use of technology, holding the government agency accountable for using the technology only for its defined purpose, and ensuring that it is working effectively.
Tip: Connect with this office early to understand the assessment structure for the technology you are seeking, and ensure compliance on the front end to help prepare for and pass routine audits.
How: Work together with the oversight offices to understand the bandwidth and structure of the office to conduct routine audits and risk assessments of the technology. This includes recognizing their capacity to conduct any assessments, the structure for those assessments, what materials may need to be shared, and other components that can help make the assessment both comprehensive and effective.
  • By working collaboratively before the technology is adopted, you can plan for ways to increase capacity of the oversight office or the government agency to ensure that the assessments are properly completed.
  • Early communication can also help both the oversight office and the agency seeking the technology develop a plan to work with existing resources in an efficient way when introducing and implementing the technology.

LAW ENFORCEMENT

Who: Local Police Departments (municipal, state, etc.), Police Chiefs, Parole Board
What: Law enforcement agencies often acquire technologies at a rapid pace to address security and safety concerns. These technologies can have privacy and civil liberties implications.
Tip: Develop relationships and open channels of communication with law enforcement when seeking technology to learn from their experiences and establish robust privacy-protective measures.
How: Communicate and consult with law enforcement agencies (and they with you) to learn lessons from each other about technology that has been used. Also communicate about any data-sharing limitations that may exist, and ensure that additional resources are spent only on technology that provides value to the public while remaining privacy-protective.
  • An example of information that will need to be protected by a firewall is health information and other information that is accessible only via a court-ordered warrant. This information should not be inadvertently shared with law enforcement through technology implemented by another government agency.
Tip: Adopt processes that require greater transparency, public oversight, and accountability measures.
How: When engaging with law enforcement agencies to procure technology, it is critical for government agencies to be transparent with the public about what new technology is being sought and how it is intended to be used. This is especially important when the technology has the ability to implicate privacy rights, civil rights, and civil liberties.
  • When working with law enforcement agencies, provide the public with meaningful ways to assess the technology, ask questions, and share their concerns.
  • Incorporate public insight before acquiring the technology.

CONSUMER PROTECTION

Who: Consumer Protection Agency
What: Consumer protection offices are responsible for stopping unfair, deceptive, and fraudulent business practices by collecting complaints and conducting investigations.
Tip: Build connections with consumer protection offices before new technology is acquired to develop safeguards that can be adopted during the procurement and implementation of technology.
How: Consult with city or state consumer protection offices to help identify safeguards, such as: how to investigate deceptive business practices, how the public can file complaints associated with a technology, what mechanisms should be in place to make sure all complaints are addressed, and any other elements that can help protect consumers during the roll-out and use of technology.

RECORDS OFFICE

Who: Freedom of Information Act (FOIA) Officers, Clerks, Comptrollers
What: Records officers have access to a number of public records and fulfill related requests. Not only do such offices provide the public with documents, but they are critical to ensuring transparency and government accountability.
Tip: Develop strong relationships with record officers in different departments and agencies to create a cooperative process that ensures transparency.
How: Reach out to records offices early in the procurement process to build a cooperative relationship that encourages greater transparency when fulfilling requests filed by the media, organizations, or the public.
  • It is also essential to keep the channels of communication between agencies and the records office open so that, where transparency elements such as open source code were negotiated during the procurement process, the FOIA Officer knows that they are able to share this information and that it is not proprietary or otherwise protected.

PUBLIC HEALTH

Who: Department of Public Health, Disease Intervention Specialist, Health Protection
What: Public health experts are often directly involved with identifying public health needs and developing tools to address those needs.
Tip: Work collaboratively with public health experts so that they can inform the process to address the public health needs of the community.
How: Engage in conversations with various public health experts to make sure that all aspects of public health are accounted for and considered when thinking about a technological tool.
  • Public health departments may be best situated to identify the needs of the community and how technology can help meet those needs. They can also provide a greater understanding of existing barriers that may impede the effectiveness of the technology and identify others to work with to implement and audit its use.

3. GLOSSARY OF TERMS

“Algorithmic bias” describes systematic and repeatable errors in any computer system that can create unfair outcomes, such as arbitrarily privileging or disadvantaging one individual or group over others. This includes instances in which the application of an algorithm compounds and amplifies existing inequities in socioeconomic status, race, ethnic background, religion, gender identity, disability, or sexual orientation.

“Artificial intelligence” (“AI”) is a type of technology that simulates human intelligence processes by machines, especially computer systems. AI also includes the use or emulation of human reasoning in computer systems, such as features like natural language processing (NLP), speech recognition and machine vision. Many AI systems make use of machine learning processes, which are applications that provide systems the ability to automatically learn and improve from experience without being explicitly programmed.
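
As a purely illustrative sketch of “learning from experience without being explicitly programmed,” the toy example below fits a classifier with scikit-learn (a third-party library, pip install scikit-learn) on hypothetical labeled data; the decision rule is derived from the examples rather than written by hand.

```python
# Illustrative only: the model learns its rule from labeled examples.
from sklearn.linear_model import LogisticRegression

# Toy "experience": feature pairs and labels (no rule is hand-written).
X = [[1, 0], [2, 1], [3, 1], [8, 6], [9, 7], [10, 8]]
y = [0, 0, 0, 1, 1, 1]

model = LogisticRegression().fit(X, y)
print(model.predict([[2, 0], [9, 6]]))  # learned rule -> [0 1]
```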

“Automated decision-making” is the process of making a decision by automated means without any human involvement. Such decisions can be based on factual data, as well as on digitally created profiles or inferred data.

“Biometric identifier” refers to any piece(s) of information derived from any unique biological feature or human characteristic that can be used, directly or indirectly, to identify a specific individual. Examples of biometric identifiers include retina or iris scan, fingerprint, handprint, voiceprint, facial geometry, dental imprint, and DNA. Common characteristics such as height, weight, hair color, or eye color, are not biometric identifiers.

“Black box system” refers to any system whose inputs and operations are not visible to the user or explainable. Such systems yield outputs that cannot be reverse engineered to understand how the system arrived at a given result or outcome. This is problematic not only for lack of transparency, but also for possible biases inherited by the algorithms from human prejudices and collection artifacts hidden in training data used to facilitate machine learning, which may in turn lead to unfair, incorrect, or improper decisions. Many AI applications utilize black box systems that can stymie efforts at transparency without intervention.

“Community” means any group of people living in the same general area or who have a particular characteristic in common. As used herein, “community” is inclusive of location (e.g., neighborhood, suburb, town, or city), identity (e.g., age, race, ethnicity, gender, religion, culture, income, ability), and organizational affiliation (e.g., political, professional, familial).

“Data anonymization” or “de-identification” are processes of irrevocably occluding or removing personally identifiable information from data sets so that the people from whom data was originally collected can no longer be identified using that data.
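
For illustration only, the sketch below drops direct identifiers and replaces a record key with a salted one-way hash. Strictly speaking, retaining the salt makes this pseudonymization; destroying the salt, and accounting for quasi-identifiers (e.g., birth date plus ZIP code plus gender), moves it toward true anonymization. All field names are hypothetical.

```python
# Illustrative only: remove direct identifiers and pseudonymize a key.
import hashlib
import secrets

SALT = secrets.token_bytes(16)  # keep secret; destroy to prevent re-linking

DIRECT_IDENTIFIERS = {"name", "email", "ssn"}

def deidentify(record: dict) -> dict:
    # Drop direct identifiers, then add a salted one-way hash as a key.
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    out["subject_key"] = hashlib.sha256(
        SALT + record["ssn"].encode()
    ).hexdigest()
    return out

print(deidentify({"name": "Jane Doe", "ssn": "123-45-6789", "zip": "60601"}))
```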

“Data breach” means any unauthorized access to or acquisition of personal information, which may occur when the security, confidentiality, or integrity of a data system on which personal information is stored is compromised.
 
“Data governance” refers to the overall strategy (including policies, practices, and measures) to ensure that information (particularly personal information) is managed in accordance with applicable legal, ethical, and other internal rules and requirements. Data retention, or how long information is retained before it is destroyed, is one critical area of data governance.

“Explainability” is the extent to which the results of a given operation can be explained and understood (in human terms) as a function of the operation itself. More simply, it is the degree to which the question “Why is this happening?” can be explained and understood. Explainability is a term sometimes mistakenly used interchangeably with “interpretability,” which refers to the extent to which one is able to predict what is going to happen, given a change in input or algorithmic parameters.

“Fairness,” in the context of privacy, means that information should be processed in ways that people would reasonably expect, justly and without alteration, and not in ways that have unjust or unjustified effects.

“Geolocation data” means technologically derived information capable of determining with reasonable specificity the past or present actual physical location of an individual.

“Government-held data” refers to the universe of information in the possession, custody, or control of the government. This information may be gathered from sensors, or from more direct interactions with citizens, members of the community, and other individuals within a given locality, such as tax assessments, employee-reported damage to recreational spaces or equipment, unemployment claims, and so on. This can include historical records that may not reflect the current demographic makeup of the locality or that may be biased by historical segregation or structural racism. Relying on this data without identifying and correcting for its weaknesses may lead to inequitable treatment of communities of color and other marginalized groups.

“Government procurement stakeholders” means policymakers, procurement and contracting professionals, and all other personnel employed or working on behalf of the government in connection with the procurement of technology.

“Metadata” is information that describes and gives information about other data. Many devices and sensors embed metadata into any content they collect or create. For example, a captured image may be tagged with the time and location where it was taken, or an online form could record information about the submitter’s device and Internet connection. This information can be highly identifiable and sensitive, both independently and in the aggregate, even when the “content” itself is not.
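
For illustration only, the sketch below reads embedded EXIF metadata from an image file using the third-party Pillow library (pip install Pillow); "photo.jpg" is a hypothetical file. Capture time, device identifiers, and GPS coordinates read this way can reveal where and when a person was present even when the image content does not.

```python
# Illustrative only: enumerate metadata tags embedded in an image.
from PIL import Image, ExifTags

image = Image.open("photo.jpg")  # hypothetical file
exif = image.getexif()

for tag_id, value in exif.items():
    tag_name = ExifTags.TAGS.get(tag_id, tag_id)
    print(f"{tag_name}: {value}")  # e.g., DateTime, Make, Model, GPSInfo
```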

“Proximity data” means technologically derived information that identifies the past or present proximity of one individual to another person.

“Personally identifiable information” (“PII,” sometimes shortened to “personal information” or “personal data”) means any representation of information (in physical, electronic, or other media) that permits the identity of a specific individual to whom the information applies to be reasonably inferred by either direct or indirect means. PII can be information that directly identifies an individual (e.g., name, address, social security number or other identifying number or code, telephone number, email address, etc.), or information that, when combined with other available data elements, can be used to make an indirect identification (e.g., a combination of gender, race, birth date, geographic indicator, and/or other descriptors). In addition, information permitting the physical or online contacting of a specific individual is the same as personally identifiable information.

“Request for proposal” (“RFP”) is a document that announces and provides details about a project and solicits bids from vendors who may be able to help complete the project. (A related document, the “RFI” or “Request for Information,” is typically used earlier in the process to gather information from potential vendors.) An RFP typically outlines the needs of the project, applicable requirements and limitations, and often details other information. For simple procurement (e.g., purchase of fungible products), requests for quotations (“RFQs”) are often used to gather pricing information from potential vendors.

“Sensor data” is information derived from computerized sensors, or devices that record information about people and/or the environments in which they are placed. As sensors vary greatly both qualitatively and quantitatively, sensor data can include or relate to many types of information, such as video, audio, photographic (e.g., traffic light cameras), geolocation, movement, proximity, environmental (e.g., weight, temperature, air quality, humidity, illumination), consumption (e.g., use of electricity or other utilities), and many more. Some sensors can provide very high-quality readings or recordings, some are paired locally with analytical hardware, some are visible while others are hidden from view.

“Technology” refers to any electronic, digital, or other technological tool, method, process, or service that can be used to perform any desired task(s), including the collection, processing, or engaging in other use(s) of information. Technology also encompasses analytical techniques and algorithms used to help make classifications, predictions, or decisions, including those that could impact individuals or communities. Some technology procured by the government will have nominal relevance to privacy and civil liberties concerns (e.g., acquiring more energy efficient light bulbs for traffic lights), while many others may cause serious and grave concerns (e.g., surveillance tools, use of facial recognition, AI, stingrays, etc.). Examples of technology commonly sought by various government agencies include mobile tracking devices, software that collects user data, large-scale and integrated databases of personal information, automated decision systems, AI, integrated surveillance systems, and biometric systems. As used herein, the phrase “public technology” refers to any technology procured or used by the government in a manner that may impact the public, whether directly or indirectly.

“Third-party data use” or “downstream data use” mean any use or sharing of information (often PII) not expressly contemplated at the time the data was initially collected or provided. Third-party data use also refers to situations where a party that rightfully has access to data provides that data to another separate party.

“Transparency” is fundamentally linked to fairness, accountability, and explainability and, in the context of privacy, means being clear, open, and honest about how PII is collected, stored, managed, and used. Transparency ensures that no secret data collection occurs and provides information about the nature of PII collection and how the PII is used. Transparency requires that notice be provided to individuals about PII collection in easily accessible and understandable language.

“Use case” is any specific situation or set of circumstances in which a technology could potentially be used.
 
“User-Supplied Data” refers to information provided by users of a system, service, or application, either manually (e.g., by filling out an online form or sending a text) or automatically (e.g., by downloading a mobile app that uses a smartphone’s sensors to warn about potholes on the road while driving). This data often comes at low cost, but may over- or under-represent certain communities based on historical relationships with the government, economic inequality, or other factors that can create bias or outright error in the data set. It can also raise privacy concerns if, for example, the mobile app collects data even when not in use.

"Vendor risk management (“VRM”)" is part of the process of conducting vendor due diligence, i.e., vetting, to identify, understand, manage, quantify, and mitigate risks associated with the vendor relationship, operative data set, and underlying contract(s).