Behavior Engineering and Technology

“The telescreen received and transmitted simultaneously. Any sound that Winston made, above the level of a very low whisper, would be picked up by it; moreover, so long as he remained within the field of vision which the metal plaque commanded, he could be seen as well as heard. There was of course no way of knowing whether you were being watched at any given moment. How often, or on what system, the Thought Police plugged in on any individual wire was guesswork. It was even conceivable that they watched everybody all the time. You had to live … in the assumption that every sound you made was overheard, and, … every movement scrutinized.”

~1948, Eric Blair (George Orwell), 1984


This is the first of a multi-part series in which we break down the implications of technology-based geospatial intelligence, beginning with an analysis of pending Google patents and how these technologies provide a framework for Orwell’s grim negative-utopia, 1984. We will take a deep dive into the current and future states of intelligent-machine technology: the merger of artificial intelligence and quantum computing capabilities, SMART technologies, and the coming Internet of Things, a multitude of virtually connected devices, utilities, and people, networked to cloud-based computing systems that distribute control. As we analyze patents, business strategies, research journals, and historical writings, a grand program comes into view: a hierarchy of intelligent machines and quantum-computing forecasting designed for global control over human consciousness. Our introduction begins with a brief synopsis of one of many organizations actively piecing together a control grid of technology through strategic acquisitions and innovations in deceptive service offerings.

Filed on December 17, 2014, Google’s patent application for Smart-Home interface systems (US Patent Application 20150371422) describes a system that primarily functions as a control mechanism over entire Smart-Home networks and works in conjunction with SMART-grid technology. Utilizing intelligent algorithms to calculate “progress towards one or more control goals in order to achieve control that satisfies potentially conflicting goals,” the control hub is designed to achieve a “balance” between occupant needs and government policy. You may ask yourself, what constitutes a conflicting goal? YOU are the conflicting goal! Google blatantly points out that “smart [technologies] enable two-way communication between the [device] and the utility company…,” meaning that your needs are always secondary and authorities enforce conformity. Google divulges its continuous monitoring and intelligence-gathering schemes to record deviations between users’ habits and usage allowances: daily habits are logged while external environmental directives dictate permitted usage.
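The patent’s language about satisfying “potentially conflicting goals” describes a weighted, multi-objective control scheme. As a rough illustration of how such a hub could privilege a utility directive over an occupant preference, here is a minimal sketch; every name, weight, and temperature below is our assumption, not a value taken from the application:

```python
# Hypothetical sketch of the weighted, multi-objective scoring the patent
# alludes to. All names, weights, and temperatures are our assumptions,
# not values taken from US 20150371422.

def control_score(setpoints, goals):
    """Weighted distance of the current setpoints from each goal.

    setpoints: dict of current values, e.g. {"thermostat_f": 72}
    goals: list of (key, target, weight); a higher weight means the
    controller treats that goal as the higher priority.
    """
    return sum(w * abs(setpoints[k] - target) for k, target, w in goals)

# Two conflicting goals: the occupant wants 72 F, while the utility's
# demand-response directive asks for 78 F and carries more weight.
occupant = ("thermostat_f", 72, 1.0)
utility = ("thermostat_f", 78, 3.0)

# The hub "balances" the conflict by minimizing the combined score.
candidates = range(68, 82)
best = min(candidates, key=lambda t: control_score({"thermostat_f": t},
                                                   [occupant, utility]))
print(best)  # 78: the heavier-weighted utility goal wins outright
```

Because the utility’s goal carries the larger weight, the minimizing setpoint lands on the utility’s target, not the occupant’s; the “balance” is whatever the weights say it is.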

The implication of mass geospatial intelligence, the gathering of human data points at the micro-level, combined with quantum artificial intelligence capabilities, will lead to staggering precision in forecasting and steering the future. Bureaucracies seek direct control over a person’s day-to-day activities, with continuous real-time monitoring, fines, criminal penalties, and financial disincentives designed to create a chilling effect: a planetary system for behavior modification.

Overcoming the problem of rural locations, US Patent Application 20150371422 details information transmission using data-over-power, which enables broadband communication over power lines where modern network infrastructure is unavailable. The patent also refers to a conglomerate of SMART systems detecting signals sent by “sensors embedded within the controlled entity.” The artfully vague terms in this section are intentionally left generic, hinting at the grand scope of what entities may embody future technology systems. Consider the phrase “controlled entity” alongside emerging wearable technologies. Not only does wearable technology act as a tracking sensor within the smart-home, it also records user vitals such as heart rate, breath, and pulse. The patent touts knowing when home occupants are sleeping and when they are away. Think: just how will the system make this determination?

Other integrated smart devices are mentioned in patents such as US Patent Application 20150109104: wall-plugs, lighting, security systems, and intercom systems, all of which host mechanisms for continuous video and audio data recording from every corner of the home (Google Patent Figure 1). Keep in mind that smart wall-plugs log every action on every device used; smart intercoms log every word spoken (full scope explained by CIA Chief Petraeus); and smart lighting systems directly interface with the human domain through data-over-light, a method of utilizing oscillating electromagnetic pulses to transmit information covertly. (Terabit Free-space Data Transmission)

Data collection includes “information obtained through the Internet, various remote information sources, and even remote sensor, audio, and video feeds…”, meaning every action, including duration and time of day, will be logged and transmitted for inspection (Google Patent Figure 2 – Data Sharing); the population studied like lab rats by the next generation of technological elite; the global brain achieving god-like omnipresence. Time-interval and real-time consumption statistics will form a multitude of personal data points, from what happens in your bathroom to what happens in your bedroom. All of your most personal moments handed over “in part or in whole, to various remote systems and organizations, including charities, governments, academic institutions, businesses, and utilities.”

In addition to compliance metrics, data will be used for predatory advertising geared to exploit the private moments of every participant, witting or not:

“external entities may collect, process, and expose information collected by smart-home devices within a smart-home environment, may process the information to produce various types of derived results which may be communicated to, and shared with, other remote entities, and may participate in monitoring and control of smart-home devices within the smart-home environment as well as monitoring and control of the smart-home environment.”

The selling points: safety, conservation tax credits, homeowners-insurance discounts, and geek culture (the Star Trek generation). Perhaps as an appeal to science fiction fans, or a result of predictive programming, Google’s SMART thermostat interface eerily resembles HAL 9000 from Arthur C. Clarke’s 2001: A Space Odyssey, equipped with an all-seeing eye covered by smoked or mirrored glass, so as not to disturb home occupants while logging every move, studying breath patterns, identifying habits, flagging fluctuations in heart rate, and controlling everything in the home down to the windows and doors…

“Open the front door, HAL!”

(soft, non-threatening voice) “I’m sorry, Dave. I’m afraid I can’t do that.”


As each day passes, Google silently constructs a Legion [for we are many] of data points known as the Internet of Things, managed by the awesome power of quantum AI: intelligent machines whose sole purpose is control; control over entire networks of smart-homes, right down to the wall socket; control over humanity; control over individual behavior. These systems receive and transmit simultaneously via smart-metering systems, logging “information obtained through the Internet, various remote information sources, and even remote sensor, audio, and video feeds…”. Taking a deeper dive into the technologies making up the Machine, we introduce our quest to understand how each component ties into a grand program: a psychocivilized society and total physical control of mind and reality.

Filed on August 6, 2010, Google’s patent application for a speech recognition system (US Patent Application 20130091071) describes a system designed for real-time auditory monitoring of the population. In classic Google fashion, consumers are lured in by gadgetry and flashing lights. According to Google’s own words, it plans to get its eyes and ears into your home, car, and office by marketing voice-capable computing as convenient and safe, touting smart-interface integration and social networking capabilities. Google effectively changes the paradigm of voice-activated devices by taking end-user control out of the equation and handing it over to computer overlords. This technology doesn’t actually solve any problems unless users are truly inconvenienced by having to press an ‘ON’ button (or provide some other physical cue) to initiate voice-input mode. The patent makes a patronizing statement, actually referring to use of an “ON” button as “formalities”; after all, we cannot be inconvenienced by such decisions.

While the user’s visibility is limited to the smart-device interface, on the back end an invisible network of intelligent computing systems with decision-making power is at work. Intelligence-gathering servers labeled ‘Context Units’ track the environment surrounding the phone, employing “satellite-based positioning techniques, base station transmitting antenna identification, multiple base station triangulation, Internet access point IP location determinations, inferential identification of a user’s position based on search engine queries, and user-supplied identification of location”. Interestingly enough, Google servers interact with neighboring devices, activating microphones as you come into proximity. For example, smart-intercom systems within your home initiate as you arrive home from work; hidden microphones in smart streetlights or in retail stores activate as you walk by. Combined with data harvesting, decision-making intelligent machines calculate [threat] scores and weights based on physical motion, personal data, background noise, camera images, social network activity, emails, telephone metadata, calendar information, and text messages to determine whether microphones are enabled, listening in when “monitoring for voice input [is] convenient for the mobile computing device.” The system also comes equipped with a handy-dandy User Behavior Data Repository, logging all selections, including user requests to disable monitoring and what the user was doing when the device was prompted to disable microphones.
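The score-and-weight decision the application describes amounts to combining context signals into a weighted sum and comparing it against an activation threshold. A minimal sketch of that pattern follows; the feature names, weights, and threshold are invented by us for illustration, as the application does not publish its actual values:

```python
# Illustrative only: the feature names, weights, and threshold are our
# assumptions; US 20130091071 does not publish its actual values.

FEATURE_WEIGHTS = {
    "stationary": 0.3,         # accelerometer shows the device at rest
    "low_noise": 0.2,          # quiet acoustic background
    "at_known_location": 0.3,  # e.g. home or office, per location signals
    "calendar_free": 0.2,      # no meeting currently on the calendar
}
ACTIVATION_THRESHOLD = 0.5

def should_listen(context):
    """Combine boolean context signals into a weighted score and decide
    whether always-on microphone monitoring is enabled."""
    score = sum(w for feature, w in FEATURE_WEIGHTS.items()
                if context.get(feature))
    return score >= ACTIVATION_THRESHOLD

print(should_listen({"stationary": True, "at_known_location": True}))  # True
print(should_listen({"low_noise": True}))                              # False
```

The point of the pattern is that the user never sees the score: the microphone switches itself on whenever enough back-end signals line up, with no physical cue required.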

Peppered throughout the patent language, Google claims it will offer users the option to disable voice monitoring, but a recent Google Chromium snafu suggests otherwise. Google carefully packaged malware embedded in Chromium code that captured and transmitted voice data back to the conglomerate, without user knowledge or any option to disable it, causing an uproar in the Linux and Unix communities. Bottom line, computers will determine “whether to switch to the second mode of operation, and activating the microphones and the speech analysis subsystem is performed without direction from a user.”

Smart-home wardens, wiretapping under the guise of convenience, the end of privacy, this is just the beginning – welcome to the machine.

As we shift our journey toward the intricacies of Mind and human consciousness, we end our introduction to technology with an excerpt from Jose Delgado’s 1969 publication in World Perspectives volume 41, titled Physical Control of the Mind: Toward a Psychocivilized Society:

“Civilized man has surrounded himself with a multitude of instruments which magnify his senses, skills, strengths, and the speed with which he can travel, without realizing, perhaps that in his drive to be free from natural elements, he was creating a new kind of servitude dominated by levers, engines, currency, and computers.

Because of the magnitude of our material and intellectual powers, the directive resolutions made by elite groups may be decisive for the development of scientific and economic fields of endeavor, for the evolution of civilization in general, and for the very existence of man.

The “balance of terror” existing in the present world reflects the discrepancy between the awesome technology and the underdeveloped wisdom of man.

While our mental faculties are incomparably superior to those of early land animals, we still lack adequate self-knowledge and control, and natural history teaches that when underdeveloped brains are in charge of great power, the result is extinction.

The danger that the entire culture may become technological is obvious, and the divorce of mind from life which is taking place catastrophically everywhere in the modern world has evoked strong intellectual reactions and is one of the central themes of existential philosophy.”


Phase 2 of the Chicago Array of Things (AoT) is scheduled for deployment in September 2016. Beginning in August 2016, 46 surveillance sensors will be installed along the Lake Michigan shoreline and throughout downtown. By 2018, over 500 data-tracking points will be peppered throughout the Chicago metropolitan area: an elaborate data collection system converting the inner-city public into a laboratory for behavior mapping and pattern recognition.

According to the Chicago AoT Privacy Policy, the data collected will include climate, air quality, public-space usage, electronic device identifiers, license plate location and identification, audio recordings, images, personal characteristics, voice signature, facial geometry, and other biometrics. They will map behavior patterns and individual habits, and collect information linkable to religion, activities, geographical location, medical information, education information, and financial information. (Chicago AoT Data Tracking Locations)

Operationally, the program is run by the Urban Center for Computation and Data (UrbanCCD) in partnership with the City of Chicago and the SMART Chicago Collaborative. Research institutions with primary access to data include the Computation Institute at the University of Chicago and Argonne National Laboratory, a research arm of the United States Department of Energy.

The stated purpose of the AoT program: “an urban sensing project, a network of interactive, modular sensor boxes that will be installed around Chicago to collect real-time data on the city’s environment, infrastructure, and activity for research and public use.” While there are considerations for public use, there was no public buy-in, no vote, no consensus, and laughable marketing and public outreach. The Array of Things (AoT) Civic Engagement Report touts transparency and public engagement; however, citizens of Chicago were allowed to comment ONLY through online forms and TWO public meetings. The first meeting was held at 5:30PM on Tuesday, June 14, 2016 at Lozano Library; only 40 people attended. The second meeting was held at 5:30PM on Wednesday, June 22, 2016 at Harold Washington Library; again, only 40 people attended. One wonders why meetings were held on weekdays and at such a difficult hour for the average person to attend… and how such an obvious failure in marketing and engagement could be touted as public acceptance. Only 40 people? Really?

SMART Chicago Collaborative’s outreach goals made it clear…

  • Educate Chicagoans about the Array of Things project, process, the potential of the research, and the sensors’ capacities
  • Inform future generations of the Array of Things sensors
  • Understand what the people want out of the Internet of Things & these neighborhood data
  • Collect resident feedback on privacy and governance policies for Array of Things

… public meetings presupposed program acceptance, which was never on the discussion board; Chicago citizens were denied the option to say no, a classic neuro-linguistic programming and Delphi technique. People attend the meetings under the guise of involvement when none exists. Meetings are strictly for gauging public pushback, at an insignificant statistical level (80 people representing 2.7 million).


Perhaps diving into privacy policy language will shed some light on the unusual strategy for transparency and public engagement.

“All operational sensor data will be publicly available as open data, owned by the University of Chicago.”

On the AoT policy responses page, the program confirms that harvested data (electronic device identifiers, license plate location and identification, audio recordings, images, personal characteristics, voice signature, facial geometry, and other biometrics) is owned by the University of Chicago, AND the university will also hold copyright! Recordings of private conversations, images of individuals, databases of public information, all locked up under copyright. This is not a public system with public oversight (no Freedom of Information requests, no redress of grievances, no recourse); this is a private system with just enough public access to quiet the masses through the illusion of participation.

Secondly, once the vast amount of behavioral and pattern data is publicized, it can be used for ANYTHING; no controls exist on who may use the data or how.

“In order to support economic development, data from approved experimental sensors, installed for specific research and development purposes, may be withheld from (or aggregated for) publication for a period of time in order to protect intellectual property, ensure privacy or data accuracy, and enable the proper calibration of the sensor.”

One wonders what exactly “experimental sensors” translates to: specifically, what functionality, what additional data will be harvested, who will be performing the experiments, whether the experimental studies will be made public, and whether the public will be notified prior to experimentation.

“The privacy policy sets forth how the operators of the Array of Things program will collect and manage data, some of which may include personal information or Personally Identifiable Information (PII).” PII includes name, ID numbers, email address, home address, personal characteristics, photos of identifying characteristics, fingerprints, handwriting, retina scans, voice signature, facial geometry, other biometrics, and information linkable to date of birth, place of birth, race, religion, weight, activities, geographical indicators, medical information, education information, and financial information.

How exactly will the system detect PII in images and audio files, and how will this data be scrubbed before information is released to the public? Secondly, the published privacy policy and policy responses fail to give citizens any mechanism for notification when an unintended PII release has occurred. As it stands now, there is no process for correcting personal information, no way to remove personal information, and no way of knowing exactly what information is collected. The public is intentionally left in the dark about program capabilities, now and in the future.
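The scrubbing problem is harder than the policy lets on. Even the easiest case, redacting a well-structured identifier such as a Wi-Fi MAC address from a textual sensor log, requires an explicit pattern for every PII type; faces in images and voices in audio have no such clean signature. A hypothetical text-only sketch (the log-line format and redaction token are our inventions, not AoT’s):

```python
# Text-only analogue of the scrubbing problem. The log-line format and
# the redaction token are hypothetical; AoT has published no such details.
import re

# A Wi-Fi MAC address is one of the few PII types with a clean, regular
# shape; faces in images and voices in audio have no such signature.
MAC_RE = re.compile(r"\b(?:[0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}\b")

def scrub(record):
    """Replace every MAC address in a log record with a fixed token."""
    return MAC_RE.sub("[REDACTED-MAC]", record)

print(scrub("probe request from a4:5e:60:c2:19:7f at node 014"))
# prints "probe request from [REDACTED-MAC] at node 014"
```

Every PII type that lacks such a regular shape needs its own detector, and each detector has its own error rate, which is exactly why “PII will not be made public” is a promise that is easy to write and hard to keep.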

“PII data, such as could be found in images or sounds, will not be made public”

This line explicitly states that public imagery and audio will be captured, then withheld from the public. However, when we spoke to privacy activists and big-data experts at various universities, they warned that the Array of Things will in fact publicize PII data, according to their meetings with AoT project leaders, and to expect another privacy-policy revision. Does public audio recording even comply with Illinois wiretapping law?

According to Illinois code:

A person commits eavesdropping when he or she knowingly and intentionally:

(1) Uses an eavesdropping device, in a surreptitious manner, for the purpose of overhearing, transmitting, or recording all or any part of any private conversation to which he or she is not a party unless he or she does so with the consent of all of the parties to the private conversation;

(2) Uses an eavesdropping device, in a surreptitious manner, for the purpose of transmitting or recording all or any part of any private conversation to which he or she is a party unless he or she does so with the consent of all other parties to the private conversation;

(3) Intercepts, records, or transcribes, in a surreptitious manner, any private electronic communication to which he or she is not a party unless he or she does so with the consent of all parties to the private electronic communication;

(4) Manufactures, assembles, distributes, or possesses any electronic, mechanical, eavesdropping, or other device knowing that or having reason to know that the design of the device renders it primarily useful for the purpose of the surreptitious overhearing, transmitting, or recording of private conversations or the interception, or transcription of private electronic communications and the intended or actual use of the device is contrary to the provisions of this Article; or

(5) Uses or discloses any information which he or she knows or reasonably should know was obtained from a private conversation or private electronic communication in violation of this Article, unless he or she does so with the consent of all of the parties.

Considering the law, an obvious conflict exists, raising the question: how exactly will pedestrians be given notice that they are within range of data collection? (Perhaps there are plans to erect a holographic transmitter that will beam each passer-by a copy of the privacy policy, updates, and notices in clear language, with a consent form that can be signed by digital thumbprint.)

“For the purposes of instrument calibration, testing, and software enhancement, images and audio files that may contain PII will be periodically processed to improve, develop, and enhance algorithms that could detect and report on conditions such as street flooding, car/bicycle traffic, storm conditions, or poor visibility.”

The language describing how PII will be processed is left intentionally vague, with no indication of what the system’s capabilities actually are or why PII specifically will be used as opposed to non-PII. This allows program operators to widen the scope of data harvesting and utilization without policy modification.

“Raw calibration data that could contain PII will be stored in a secure facility for processing during the course of the Array of Things project, including for purposes of improving the technology to protect PII. Access to this limited volume of data is restricted to operator employees, contractors and approved scientific partners who need to process the data for instrument design and calibration purposes, and who are subject to strict contractual confidentiality obligations and will be subject to discipline and/or termination if they fail to meet these obligations.”

This section implies that no retention policy exists for PII; data is stored and used for the life of the project. How long is that? If images and audio are stored indefinitely, any individual can be identified based on habits and clothing, and common routes can be used to determine the location of home, work, family, and friends.


Interestingly enough, the online forum provided by AoT for public comments and questions buried the most articulate questions without addressing them on the AoT policy responses page.

From the Symposium on Usable Privacy and Security (SOUPS), members including: Lorrie Faith Cranor, Carnegie Mellon University; Alain Forget, Google; Patrick Gage Kelley, University of New Mexico; Jen King, UC Berkeley; Sameer Patil, New York University / Indiana University; Florian Schaub, Carnegie Mellon University / University of Michigan; Richmond Wong, UC Berkeley:

At the Symposium on Usable Privacy and Security 2016, held last week (June 22-24, 2016) in Denver, Colorado, a group of privacy and security researchers looked at the Array of Things project and its current documentation. The short report below is a compilation of their feedback. Overall, we appreciated the thought and care given to privacy and security throughout the proposed documents and the Array of Things project. Having a period of public comment, an open and thoughtful process for selecting new node locations, and an AoT Security and Privacy group are steps that lead to practical privacy for the people of Chicago.

That said, we have comments on a few areas of the document that we hope you will consider.

PII in the open data set

In the privacy policy, you say “PII data, such as could be found in images or sounds, will not be made public.” What is the process for deciding what is PII and removing it? Removing all PII from this data set may actually be fairly difficult and error prone, and there may be a lot of PII, especially if video captures faces or license plate numbers. You should determine what will be involved in doing this and perhaps revise the language in the privacy policy to set more realistic expectations.

Is there a way for people who believe their PII has been shared to have it removed? Currently there is no contact information in the Privacy Policy, and thus no way for people to remove or correct information they believe is inaccurate or wrongly shared.

If sound recordings are going to be made, it is important to make sure this is in compliance with the Illinois wiretapping law.

Notice

The current policy document has no specifics on how notice will be provided to residents of node areas or visitors who happen to drive or walk through the range of a node. We believe significant thought needs to be given to how to notify people that they are in the area/range of a node and their data is being collected. This will also allow them to find out what choices they have in removing their PII or other data from an open repository.

We hope that consideration will be given to notice, including:

  • What languages will the information be presented in?
  • What technologies will be used (e.g., a sign, a short link, a QR code, some sort of mobile notification scheme, an app to show which streets are covered by these nodes)?
  • The format and display of the information itself (e.g., a street sign, at what height, using what set of color schemes or logos that relates to the project).
  • Is there any effort made to allow people with low literacy rates or vision impairment to have access to this material?
  • How updates to the project’s policies and notices can be communicated to people who walk or drive through the range of a node.
  • A plain-language (non-legalese) version of the privacy policy should be made accessible to the public.
  • Notices should include contact information for the Privacy Officer or similar role responsible for managing privacy issues on the project.

Data Use / Purpose

In most privacy policies, it is important to explain what collected data will be used for.

While much of the data collected as part of this project will be made public (through the open data repository) and then can be used for nearly anything, it is still important to explain potential data use to participants. This should include, at least:

  • A description of how each data type collected will be anonymized and aggregated.
  • Specific examples that show how each data type could potentially be used.
  • What sorts and format (i.e., aggregated versus specific data items) of data the annual report will include.
  • Consideration of establishing a use policy for the open data set, or setting up guidelines for how to respond in the event that open AoT data is used by other parties for malicious or discriminatory purposes.
  • Notice regarding whether the data will be used by law enforcement for any purpose.

Annual Report

While it is commendable that the AoT group has declared that the policy will be reviewed annually, we would recommend that the review include more specification (What sources of data will be reviewed? How can the community participate? Will this include potential breaches, violations of policy, and/or public complaints?), as well as address the need for evaluation, specifically: Is the project meeting its stated goals? Who will review the project for compliance with its stated policies, and how will this review be conducted? How will the annual report be distributed to the public?

Small edits to the language

“Collection may include but is not limited to” or “other biometric data” are phrases that should be avoided. While they may be standard legalese for privacy policies, given your project’s spirit and values, we recommend that you strive for openness and transparency. You should do your best to explicitly describe all data collected and the purpose of collecting them. If more types of data are collected in the future, then the descriptions and explanations should be updated.

From the Future of Privacy Forum (FPF), a think tank seeking to advance responsible data practices, supported by leaders in business, academia, and consumer advocacy:

We would like to thank the Array of Things (AoT) project for this opportunity to provide feedback on the proposed Governance and Privacy Policies, and to engage with the broader Chicago and smart city communities. We applaud the AoT’s commitment to building a transparent and responsive program. While this initial privacy policy proposal provides a useful starting point, we urge the AoT’s Security and Privacy Group and Executive Oversight Council to expand or revise it in several ways to better achieve its goals of balancing privacy, transparency, and openness.

1. The Privacy Policy should reflect a FIPs-based framework. The Fair Information Principles (FIPs) are “the framework for most modern privacy laws around the world,” and NIST recommends that in order to “establish a comprehensive privacy program that addresses the range of privacy issues that organizations face, organizations should take steps to establish policies and procedures that address all of the Fair Information Practices.” The current AoT Privacy Policy addresses some, but not all, of these principles.

  • In a more robust FIPs-based Privacy Policy, we would also expect to see meaningful details regarding:
    – What rights or mechanisms, if any, individuals might have to access, correct, or request the deletion of their PII.
    – What mechanisms, if any, provide individuals with redress regarding the use of their PII.
    – In addition to discipline and confidentiality promises, what accountability controls (such as employee training, vendor audits, or data use agreements) will help ensure employees, contractors, and approved partners with access to PII comply with the privacy policy.
  • How long PII will be retained, how PII will be disposed of after it is no longer reasonably necessary for the purposes for which it was collected, and how PII will be treated if the AoT program dissolves or transfers ownership.
    – How and when PII will be deleted or de-identified.
  • How the program operators will respond to requests from local, state, or federal civil or law enforcement agencies to access PII (such as when presented with a warrant or subpoena) and to what extent PII is subject to Freedom of Information Act disclosure requests.
  • Information on how to contact AoT officials regarding any privacy or data security breaches.
  • How PII will be secured through appropriate administrative, technical, and physical safeguards (such as encryption at rest and in transit, local processing or storage, etc.) against a variety of risks, such as data loss, unauthorized access or use, destruction, modification, or unintended or inappropriate disclosure.
  • What mechanisms, if any, are available for individuals to exercise control or choice over the collection of PII (e.g., could individuals turn off their phones or participate in an opt out to avoid certain kinds of tracking?)
  • How the AoT minimizes the collection of PII.

Importantly, given the significant amount of information that residents of and visitors to Chicago might be expected to digest, a layered privacy notice highlighting key points would be appropriate. Additional notifications, such as public signage on or around AoT nodes or just-in-time mobile notices pointing to the full privacy policy, might also help provide meaningful notice.

2. More meaningful technical details within the Privacy Policy would improve trust and transparency for the wide array of stakeholders interested in assessing the program’s privacy and security promises and practices. The AoT’s Privacy Policy is relevant not just to the citizens and communities of Chicago but also to a wide range of civil society organizations; other local, state, and federal government officials; academics; potential vendors or research partners; technologists and privacy professionals; and the media.

  • In privacy nomenclature, describing data as PII typically means that the data can be linked to an identifiable individual, whereas considering data “sensitive” typically signals that the data will be treated to a higher standard of privacy protection. In order to avoid confusion, we suggest clarifying these terms.
    – When audio or image files may contain PII, what specific kind of PII is collected. There is a stark difference in privacy impact between software used to simply detect faces (facial detection) and software capable of identifying individuals in photos via biometric templates (facial recognition).
  • A similar distinction is made between speech detection and speech recognition capabilities. Given the general public unease about loss of anonymity and privacy in public spaces, it is key to clarify what technologies are being used in this context and what capabilities they have for processing PII. This will help allay fears regarding the use of PII from image and audio files captured in public spaces.
  • How the AoT will ensure adequate de-identification for data made public through the City’s data portal. Open data enables important scientific research and urban innovation. Given the AoT’s intent to make its data available freely, it must implement the strongest possible protections against the intentional or inadvertent re-identification of any individuals within the data set. AoT should clarify publicly how it will ensure that the risk of re-identification is sufficiently low that individual privacy can be guaranteed. What is the acceptable threshold for re-identification risk, and how is it calculated? Will the AoT use differential privacy solutions? How will AoT handle the de-identification within image or audio files as opposed to structured textual data? Will any legal controls or commitments (such as agreements to not attempt to re-identify data) be required before accessing de-identified data? While not expected to publish every detail of its de-identification strategy or lock itself into a particular set of practices, the AoT should make known important parameters to increase trust and transparency.
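One concrete de-identification technique raised above is differential privacy. For a simple counting query (say, pedestrians passing a node in an hour), the standard Laplace mechanism adds calibrated noise so that no single person’s presence measurably changes the published number. The sketch below is a generic illustration, not AoT’s actual pipeline; the count and epsilon values are ours:

```python
# Generic Laplace-mechanism illustration; the count and epsilon are our
# example values, not anything published by the AoT program.
import math
import random

def laplace_count(true_count, epsilon):
    """Release a count with Laplace(0, 1/epsilon) noise added.

    A counting query has sensitivity 1 (one person changes the count by
    at most 1), so noise of scale 1/epsilon gives epsilon-differential
    privacy; smaller epsilon means more noise and stronger privacy.
    """
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

random.seed(0)  # fixed seed so the example is deterministic
noisy = laplace_count(412, epsilon=0.5)  # noise scale = 1 / 0.5 = 2
```

With epsilon = 0.5 the published figure typically differs from the true 412 by only a few units, yet a dataset with any one individual removed yields a statistically near-indistinguishable release, which is the property a re-identification threshold would be stated in terms of.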

3. Additionally, FPF recommends that all smart city initiatives, including the AoT, implement a variety of other organizational and technical measures to safeguard personal data, including:

  a. Mapping data flows, including where data is collected and how it is used throughout the entire AoT ecosystem.
  b. Classifying data according to sources, identifiability, sensitivity, and uses.
  c. Documenting processes and procedures for sharing data with third parties and monitoring vendors, including data use agreements, audit and standard contractual terms, and transparency about how and by whom scientific partners are “approved.”
  d. Safeguards to protect against unfair or discriminatory uses of data.
  e. Identifying what data sets are owned by which stakeholders, and any relevant copyright, licensing, or access provisions.
  f. Documenting risk-benefit assessments and structured ethical review processes for evaluating new research or uses of PII. (See, e.g., …)

Thank you again for this opportunity to comment.

“In general you could not assume that you were much safer in the country than in London. There were no telescreens, of course, but there was always the danger of concealed microphones by which your voice might be picked up and recognized…”

~Orwell, 1984