Machine learning and predictive analytics are not necessarily concerned with you as an individual, but rather with the probability that you will behave a certain way, given how closely you correlate with others based on patterns in the data.
In the Western world, we tend to think about data privacy – both legally and culturally – from an individual standpoint. Once data is anonymized and aggregated – that is to say, disconnected from an individual (reidentification issues notwithstanding) – most data privacy laws no longer apply. Yet even if your individual identity is not at play, harms can still befall those deemed to be part of a group, particularly a group with shared genetic traits.
Genetic data opens questions surrounding a host of traditionally protected categories of data, placing it in “privacy adjacent” territory. It is an area that seems to be out of scope under our current privacy laws, which focus solely on personal information, even as we look to those laws to help address the adverse impacts of data-driven AI systems built on aggregated data.
As Dan McQuillan writes: “An algorithmic condensation of social conditions results in the acceleration of differences such as racialised deprivation or disability. AI can decide which humans are disposable.”