The tech giant has many ways of gathering information about its users’ activity – from Prime to Alexa. But how much can it collect and what can you do to keep your life private?

From selling books out of Jeff Bezos’s garage to a global conglomerate with a yearly revenue topping $400bn (£290bn), much of the monstrous growth of Amazon has been fuelled by its customers’ data. Continuous analysis of customer data determines, among other things, prices, suggested purchases and what profitable own-label products Amazon chooses to produce. The 200 million users who are Amazon Prime members are not only the corporation’s most valuable customers but also its richest source of user data. The more Amazon products and services you use – whether it’s the shopping app, the Kindle e-reader, the Ring doorbell, Echo smart speaker or the Prime streaming service – the more its algorithms can infer what kind of person you are and what you are most likely to buy next. The firm’s software is so accomplished at prediction that third parties can hire its algorithms as a service called Amazon Forecast.

Not everyone is happy about this level of surveillance. Those who have requested their data from Amazon are astonished by the vast amounts of information they are sent, including audio files from each time they speak to the company’s voice assistant, Alexa.

Like those of its data-grabbing counterparts Google and Facebook, Amazon’s practices have come under the scrutiny of regulators. Last year, Amazon was hit with a $886.6m (£636m) fine for processing personal data in violation of EU data protection rules, a decision it is appealing against. And a recent Wired investigation showed concerning privacy and security failings at the tech giant.

So, what data does Amazon collect and share and what can you do to stop it?

The data Amazon collects, according to its privacy policy

Strict EU regulation – the General Data Protection Regulation (GDPR) – and its UK equivalent, the Data Protection Act, limit the ways personal data can be used in Europe compared with the US. But, according to Amazon’s privacy policy, the tech giant still collects a large amount of information. This covers three areas: information you give Amazon, data it collects automatically and information from other sources, such as delivery data from carriers.

Amazon can collect your name, address, searches and recordings when you speak to the Alexa voice assistant. It knows your orders, content you watch on Prime, your contacts if you upload them and communications with it via email. Meanwhile, when you use its website, cookie trackers are used to “enhance your shopping experience” and improve its services, Amazon says.

Some of the data is used for “personalisation” – big tech speak for using your data to improve your online experience – but it can reveal a lot about you. For example, if you just use its online retail site via the app or website, Amazon will collect data such as purchase dates and payment and delivery information.

“From this information, Amazon can work out where you work, where you live, how you spend your leisure time and who your family and friends are,” says Rowenna Fielding, director of data protection consultancy Miss IG Geek.

At the same time, information from Prime Video and Fire TV about what you watch and listen to can reveal your politics, religion, culture and economic status, says Fielding. If you use Amazon to store your photos, a facial recognition feature is enabled by default, she says. “Amazon promises not to share facial recognition data with third parties. But it makes no such commitment about other types of photo data, such as geolocation tags, device information or attributes of people and objects featured in images.”

Amazon Photos does not sell customer information and data to third parties or use content for ad targeting, an Amazon spokesperson says, insisting the feature is for ease of use. You also have the option to turn the feature off in the Amazon Photos app or on the website.

Meanwhile, Amazon’s Kindle e-reader will collect data such as what you read, when, how fast you read, what you’ve highlighted and book genres. “This could reveal a lot about your thoughts, feelings, preferences and beliefs,” says Fielding, pointing out that how often you look up words might indicate how literate you are in a certain language.

Smart speakers have been criticised by privacy advocates and devices such as Amazon’s Echo have been known to be activated accidentally. But Amazon says its Echo devices are designed to record “as little audio as possible”.

No audio is stored or sent to the cloud unless the device detects the wake word, and the audio stream is closed immediately after a request has ended, an Amazon spokesperson says.

More broadly, Amazon says much of the information it collects is needed to keep its products working properly. An Amazon spokesperson says the company is “thoughtful about the information we collect”.

But it can add up to a lot of data. In 2020, a BBC investigation showed how every motion detected by its Ring doorbells and each interaction with the app is stored, including the model of phone or tablet and mobile network used. Ring can share your stored data with law enforcement, if you give your consent or if a warrant is issued.

How Amazon shares data across its own services

The more services you use, the bigger Amazon’s opportunity to collect your data. “If you have bought fully into the Amazon experience, you will share details, habits and information that the company will collect and potentially use to ‘enhance your experience’,” says Richard Hale, a senior lecturer in digital forensics at Birmingham City University.

But what exactly is shared within its own companies isn’t clear. The privacy policy section on data sharing within the Amazon group of companies is “pretty limited”, says Will Richmond-Coggan, an information and privacy law specialist at Freeths LLP. Taking this into account, he says, people should “assume that any information shared with one Amazon entity will be known to any other”.

How Amazon shares your data with third parties

Like Google and Facebook, Amazon operates an advertising network allowing advertisers to use its customer data for targeting.

“Although Amazon doesn’t share information that can directly identify someone, such as a name or email address, it does allow advertisers to target by demographic, location, interests and previous purchases,” says Paul Bischoff, privacy advocate at Comparitech.

Amazon lets other companies track users visiting its website, says Wolfie Christl, a researcher who investigates the data industry. “It lets companies such as Google and Facebook ‘tag’ people and synchronise identifiers that refer to them. These companies can then potentially better track people on the web and exchange data on them.”
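The “synchronising identifiers” Christl describes is often called cookie syncing, and it can be sketched in miniature. The toy Python model below uses invented identifiers throughout – none of these IDs, tables or names come from Amazon’s or any real ad network’s systems; it only illustrates how two parties holding different IDs for the same browser can join their profiles.

```python
# Toy model of identifier syncing ("cookie syncing"). All IDs invented.
# Each company keeps its own identifier for the same browser cookie.
retailer_ids = {"cookie_abc": "retail-user-17"}      # retailer's view
ad_network_ids = {"cookie_abc": "adnet-user-9912"}   # ad network's view

# When a sync request fires, both identifiers travel together, so the
# ad network can record the mapping and link its profile to the retailer's.
sync_table = {
    ad_network_ids[cookie]: retail_id
    for cookie, retail_id in retailer_ids.items()
    if cookie in ad_network_ids
}
print(sync_table)  # {'adnet-user-9912': 'retail-user-17'}
```

Once such a mapping exists, data collected under one identifier can in principle be exchanged and attributed to the other, which is what enables cross-site tracking.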

Amazon says it doesn’t sell your data to third parties or use personally identifiable information such as your name or email for advertising purposes. Advertising audiences are only available within its ads systems and cannot be exported, and you can opt out of ad targeting via its advertising preferences page.

What you can do to stop Amazon collecting data

Amazon’s data collection is so vast that the only way to stop it completely is not to use the service at all. That requires a lot of dedication but there are some ways to reduce the amount of data collected and shared.

If you are concerned about what Amazon knows about you, you can ask the company for a copy of your data by submitting a “data subject access request”. The Alexa assistant and Ring doorbell have their own privacy hubs that allow you to delete recordings and adjust privacy settings. Ring’s Control Centre allows you to tweak settings, including who’s able to see and access your videos and personal information, from a central dashboard. Speaking to Alexa, you can say: “Alexa, delete what I just said” or: “Alexa, delete everything I said today.”

Amazon says it allows customers to view their browsing and purchase history from “Your Account” and manage which items can be used for product recommendations. More broadly, you can also use privacy-focused browsers such as DuckDuckGo or Brave to stop Amazon from tracking you.

But it’s not always easy to change the settings on Amazon itself, says Chris Boyd, lead analyst at security company Malwarebytes. He recommends turning off browsing history on Amazon and opting out of interest-based ads to reduce the level of tracking by the company. Yet he warns: “You’ll likely still see ads from Amazon or encounter third-party advertisers in one form or another – they just won’t be as targeted.”

Feature Image Credit: Under scrutiny: Jeff Bezos and his empire of platforms and devices. Illustration: Philip Lay/The Observer

Sourced from The Guardian

By Aisling Ní Chúláin

If we’ve learned anything about new means of communication over the last century, it’s that where technology attracts people’s eyes and ears, advertisers won’t be long chasing after them.

It’s been the case with radio, cinema, TV, the Internet and social media, so it seems almost impossible that it won’t be the case in the so-called metaverse – the new fully realised, shared universe that companies like Meta are proposing to build.

In perhaps a sign of things to come, a host of brands have already dipped their toes into gaming metaverses, hosting virtual fashion shows and dropping exclusive collections in game.

Luxury fashion houses like Louis Vuitton, Valentino and Marc Jacobs have all designed digital items for the social simulation game Animal Crossing – and Balenciaga has collaborated with Fortnite on an exclusive drop of wearable skins for in-game characters, to name but a few.

‘Think about it as placement in the product instead of product placement’

But now that Meta, a targeted advertising powerhouse, has staked its claim to the metaverse, some experts are raising the alarm about the specific implications immersive advertising will have for user privacy, safety and consent.

“When you think about advertising in XR, you should think about it as placement in the product instead of product placement,” Brittan Heller, counsel with American law firm Foley Hoag and an expert in privacy and safety in immersive environments, told Euronews Next.

“The way that advertising works in these contexts is a little different because you seek out the experiences. You like the experiences,” she explained.


“An ad in virtual reality may look like buying a designer jacket for your digital avatar [but] that’s an ad for a clothing company that you are wearing on your body”.

“It may look like buying a game that puts you into Jurassic Park – [but] what better way to advertise the movie franchise than to actually put you in the experience of being in Jurassic Park?”

What is biometric psychography?

The problem here, according to Heller, is that in the metaverse the capability for harvesting biometric data, and using that sensitive data to target ads tailored to you, goes far beyond the considerable amount of data Facebook already uses to build our consumer profiles.

If the technology that Meta is promising comes to fruition, a form of targeted advertising that tracks involuntary biological responses could proliferate.


For VR headsets to work in this environment, Heller says, they will have to be able to track your pupils and your eyes.

This means advertisements could be tailored according to what attracts or holds your visual attention and how you physically respond to it.

Heller has coined a term for this combination of one’s biometric information with targeted advertising: biometric psychography.

If an entity had access to biometric data such as pupil dilation, skin moistness, EKG or heart rate – bodily indicators that happen involuntarily in response to stimuli – and combined it with existing targeted advertising datasets, it would be “akin to reading your mind,” Heller said.

“The type of information you can get from somebody’s pupil dilation, for example – that can tell you whether or not somebody is telling the truth. It can tell you whether or not somebody is sexually attracted to the person that they’re seeing,” she explained.

“We’re rapidly moving into a space where your intentions and your thoughts are substantial data sets that have technological importance in a way that they didn’t before”.

“The risk that I think we’ve learnt from Cambridge Analytica is that privacy risks come into play when you have the combination of unanticipated data sets, especially when you’re looking at emerging technology”.

Regulating the metaverse

Heller believes that biometric laws in the United States are insufficient to protect users from the use or misuse of this kind of data because “biometrics laws in the States are defined by protecting your identity, not protecting your thoughts or your impulses”.

With the metaverse, the risk remains that the pace of development of the technology will outstrip the ability of institutions to regulate it effectively, as has arguably been the case with social media platforms.

In light of the fact that companies hoping to build the metaverse are multinational and operate across borders, Heller believes the most effective way to deal with these issues of user protection is a “human rights based approach”.

“There are many stakeholders in this, there’s civil society, there are public groups, there are governments and then there are intergovernmental organisations as well,” she explained.

“A human rights approach has been the way that we’ve been able to bring all of these players and their concerns together and make sure that everybody is heard”.

But what can companies do to protect people in the metaverse?

If tech organisations are serious about guaranteeing users’ digital rights in immersive environments, it will depend on them being open about the technology they are developing.

“I would want companies to be more transparent with the functionality of their technologies, not just their intentions and their business plans, but how this will work,” Heller said.

“That will help lawmakers ask the questions that they need to protect the public and to cooperate with each other for trans border technology”.

Sourced from euronews.next

By Brian O’Doherty.

It will be interesting to see if any new ideas for solving the transatlantic data privacy problem arise during the many conferences on the topic being held in over 40 countries, including Ireland, on International Data Privacy Day tomorrow. Data privacy is a very big issue for the European institutions, but maybe less so for the Americans (and British), where data surveillance seems to have been the government priority for the last 30 years or so.

It’s not only a matter of human rights and politics: it’s critically important for the world economy in the medium term. All experts seem agreed that the move to the Cloud will bring great productivity benefits and growth opportunities for business and other sectors of all economies. But there is a leading impediment to this move, and that’s security of data: the fear among corporations and other data owners that the privacy of their data – in effect, their ownership of their data – will be lost if it is to sit on Cloud-based servers owned and operated by third parties.

The debate revolves around the rights of other parties – especially governments – to access your data at will when you decide to transfer its location to the cloud server from its traditional resting place in your office computer. The best way to secure your data is to encrypt it strongly at source and make sure it stays encrypted all the time it is stored or travelling in the Cloud, until it finally returns to your own computer or another destination designated by you. But very few tech vendors offer this facility. (One that does is the Donegal start-up Netsso.com – where I must declare an interest!)
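The encrypt-at-source idea can be illustrated with a toy one-time pad in Python: the file is encrypted before it leaves your machine, only ciphertext ever sits on the cloud server, and the key never travels. This is a sketch for illustration only – real products use vetted ciphers such as AES-GCM from an audited library rather than a hand-rolled pad, and nothing here reflects Netsso’s actual implementation.

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad: a random key as long as the message, used once,
    # kept on the owner's machine. The pad is XORed with the plaintext.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XORing with the same key recovers the original bytes.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

document = b"quarterly accounts"
key, blob = encrypt(document)          # only `blob` is uploaded
assert decrypt(key, blob) == document  # readable again only where the key lives
```

Because the cloud provider (or any government compelling it) holds only `blob`, the owner’s control of the data reduces to control of the key.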

The matter will not be resolved this week. But it’s got to be resolved soon, alongside the other two great issues of taxation of, and data sharing by, the tech giants. Hopefully, Data Privacy Day in Ireland will help focus minds and, especially, improve the general public’s understanding of the issue.

By Brian O’Doherty

CEO of Netsso (www.netsso.com )

By Shannon Williams

New research has revealed 42% of organisations across the world have experienced downtime as a result of a data loss event.

According to Acronis’ 2020 World Cyber Protection Week Survey, the high number of incidents is likely caused by the fact that while nearly 90% are backing up the IT components they’re responsible for protecting, only 41% back up daily – leaving many businesses with gaps in the valuable data available for recovery.

Acronis says the figures illustrate the new reality that traditional strategies and solutions to data protection are no longer able to keep up with the modern IT needs of individuals and organisations.

In response to this, Acronis has expanded World Backup Day – the annual holiday celebrated on March 31 as a reminder to back up data – to World Cyber Protection Week.

The annual survey, completed this year by nearly 3,000 people, gauges the protection habits of users around the globe. The findings revealed that while 91% of individuals back up data and devices, 68% still lose data as a result of accidental deletion, hardware or software failure, or an out-of-date backup.

Meanwhile, 85% of organisations aren’t backing up multiple times per day – only 15% report they are. Some 26% back up daily, 28% back up weekly, 20% back up monthly, and 10% aren’t backing up at all, which can mean days, weeks, or months of data lost with no possibility of complete recovery.

Of those professional users who don’t back up, nearly 50% believe backups aren’t necessary – a belief the survey contradicts: 42% of organisations reported data loss resulting in downtime this year and 41% report losing productivity or money due to data inaccessibility. Furthermore, only 17% of personal users and 20% of IT professionals follow best practice, employing hybrid backups on local media and in the cloud.

Acronis says these findings stress the importance of implementing a cyber protection strategy that includes backing up data multiple times a day and practicing the 3-2-1 backup rule: create three copies of your data (one primary copy and two backups), store copies in at least two types of storage media, and store one of these copies remotely or in the cloud.
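The 3-2-1 rule above can be sketched as a small Python routine. Paths and file names are illustrative only (this is not an Acronis API): one folder stands in for a second storage medium and another for remote or cloud storage, and each copy is checksum-verified against the primary.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_321(primary: Path, local_media: Path, offsite: Path) -> None:
    """3 copies (primary + 2 backups), on 2 media, 1 of them offsite."""
    for dest in (local_media, offsite):
        dest.mkdir(parents=True, exist_ok=True)
        copy = dest / primary.name
        shutil.copy2(primary, copy)
        # Verify each copy against the primary before trusting it.
        assert sha256(copy) == sha256(primary), f"corrupt copy at {copy}"

# Example: the second folder stands in for an external disk, the third
# for a cloud bucket.
root = Path(tempfile.mkdtemp())
src = root / "ledger.csv"
src.write_text("date,amount\n2020-03-31,100\n")
backup_321(src, root / "external_disk", root / "cloud_bucket")
```

Verifying the digest at copy time is what guards against the silent-corruption case the survey highlights, where users wouldn’t know their data had been modified.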

“Individuals and organisations keep suffering from data loss and cyberattacks. Everything around us is rapidly becoming dependent on digital, and it is time for everyone to take cyber protection seriously,” explains Acronis chief cyber officer, Gaidar Magdanurov.

“Cyber protection in the digital world becomes the fifth basic human need, especially during this unprecedented time when many people must work remotely and use less secure home networks,” he says.

“It is critical to proactively implement a cyber protection strategy that ensures the safety, accessibility, privacy, authenticity, and security of all data, applications, and systems – whether you’re a home user, an IT professional, or an IT service provider.”

Cyber Protection Changes the Game

Acronis says that with increasing cyberattacks, traditional backup is no longer sufficient to protect data, applications, and systems; relying on backup alone for true business continuity is too dangerous. Cybercriminals target backup software with ransomware and try to modify backup files, which magnifies the need for authenticity verification when restoring workloads.

“It makes sense, then, that the survey indicated a universally high level of concern about cyberthreats like ransomware,” the organisation says.

The research found 88% of IT professionals reported concern over ransomware, 86% are concerned about cryptojacking, 87% are concerned about social engineering attacks like phishing, and 91% are concerned about data breaches. Among personal users, awareness and concern regarding all four of these threat types were nearly as high. In fact, compared with Acronis’ 2019 survey, concern about cyberthreats rose by 33%.

The survey also revealed a lack of insight into data management, exposing a great need for cyber protection solutions with greater visibility and analytics. The findings indicate that 30% of personal users and 12% of IT professionals wouldn’t know if their data was modified unexpectedly. 30% of personal users and 13% of IT professionals aren’t sure if their anti-malware solution stops zero-day threats. Additionally, 9% of organisations reported that they didn’t know if they experienced downtime as a result of data loss this year.

“To ensure complete protection, secure backups must be part of an organisation’s comprehensive cyber protection approach, which includes ransomware protection, disaster recovery, cybersecurity, and management tools. This deeply integrated approach also addresses the Five Vectors of Cyber Protection, delivering safety, accessibility, privacy, authenticity, and security (SAPAS) for all data, applications, and systems,” says Acronis.

World Cyber Protection Week Recommendations

“Whether you are concerned about personal files or your company’s business continuity, Acronis has five simple recommendations to ensure fast, efficient, and secure protection of your workloads.”

These include:

  • Always create backups of important data. Keep multiple copies of the backup both locally (so it’s available for fast, frequent recoveries) and in the cloud (to guarantee you have everything if a fire, flood, or disaster hits your facilities).
  • Ensure your operating systems and applications are current. Relying on outdated OSes or apps means they lack the bug fixes and security patches that help block cybercriminals from gaining access to your systems.
  • Beware suspicious email, links, and attachments. Most virus and ransomware infections are the result of social engineering techniques that trick unsuspecting individuals into opening infected email attachments or clicking on links to websites that host malware.
  • Install anti-virus, anti-malware, and anti-ransomware software while enabling automatic updates so your system is protected against malware, with the best software also able to protect against zero-day threats.
  • Consider deploying an integrated cyber protection solution that combines backup, anti-ransomware, anti-virus, vulnerability assessment and patch management in a single solution. An integrated solution increases ease of use, efficiency and reliability of protection.

Sourced from IT Brief

By Tom Goodwin

A generation of people have now grown up seemingly constantly broadcasting their lives on Instagram, sharing their innermost thoughts on Twitter and intimate details of life on Facebook, and yet the world seems shocked that we’ve lost any sense of privacy. We now live in an age when it seems every Instagram user wants to be an influencer – popular and envied – yet without anyone knowing anything about them.

Ever more apps continuously ask us to share location data, software updates ask us to share our personal details, messaging apps want to scan the most personal communications we can imagine and access our friends lists too. And all in an era where security breaches are common, where nefarious companies seek to sway elections, and where our data seems to be used to target us with ads that are designed to be as personal as possible, but never creepy, and yet haunt and chase us in our online lives.

Our homes are now wiretapped – not secretly and against our will; we pay money and eagerly await delivery of connected smart speakers. We now volunteer all manner of information to Google: our location, photos, our calendar invites. Our intentions are known by a global sentient network better than by our own selves.

It’s easy to think this is all a relentless march towards the dreadful future where our personal lives are invaded, where privacy is dead, where we can’t escape the filter bubble, where personalized ads follow us around like Minority Report, with few marketers aware it was a film about a dystopian future, not what should be done.

While we may hate personalization, the only thing we dislike more is irrelevance. We hate it when we phone up credit card companies and they don’t immediately know it’s us. We can’t imagine a world without Google offering us better search results based on our browsing history, we like that our weather is automatically shown in our location. Most people would happily swap mesothelioma class action lawsuit TV ads for a well-made commercial for some trendy new jeans.

The marketing and business world has long tiptoed around the edge of the privacy debate. We take as much data as we can, whenever we can; we store it badly and hope never to wake the beast that is the customer. If we were instead to build our work around earning data from people – by giving them reason to trust that we will use it wisely, not sell it, keep it highly secure and offer clear value in exchange – then life would be very different.

I’d love to see the world embrace privacy trading. How do we maximize the value offered to people in return for storing limited and intimate data about people in a transparent and trusted manner?

Uber knows that the only way for the app to work is to know where you are precisely and in real time, and we understand that and allow it. We know Google Traffic knows our location but uses it anonymously to process all traffic conditions, and we’re fine with the net benefit. Dating apps track our location because sharing that is a small price to pay for a lifelong, or evening-long, romance.

I like the thought experiment of a post-privacy world. Maybe I’m naive, but if my airline knew exactly where I was at all times then it would be able to serve me better, to come and find me if I’m in the lounge and keep the plane from leaving without me. If my credit card company knew the same, could it stop declining payments because I’m abroad and didn’t tell them? If my TV set knew I was in the market for a new car and new auto insurance, and that I liked leather manbags, is that a terrible world to live in? What if retailers had my face stored on file and I could pay for things with a smile? What if Uber could access my calendar and offer me cars when I’m running late? What if a hotel company could tell from my voice on phone calls that I’m stressed and suggest a spa for me? What if a burger joint could tell I was hungry and hadn’t been there for a while, and entice me in with a special offer? What if a clothing retailer knew my size?

It’s easy to use the slippery slope argument against this and to assume that we can’t control a precise level of privacy. A company knowing you’ve bought a TV is one thing; knowing your blood test results or genetic code is absolutely another. If health insurers, for example, could ever access some of this information, we’d have absolute mayhem.

Yet the privacy debate is rooted in paranoia. It assumes companies want to know everything, not merely enough, and likely in an anonymous way. It assumes advertisers want to build rich personal files and harass customers near endlessly. And given that this is how we’ve acted so far, it’s easy to see why.

I’d love a discussion driven less by technology and language like targeting, and more by empathy and serving people better. I’d love to see how we can start the process of asking permission: clear opt-ins, clear trust, world-class security protocols, and above all else a way to maximize the value exchange over a lifetime for all. Privacy is a recent invention; it’s perhaps the ultimate luxury for the future, but will it matter? Will our kids miss something like privacy, a concept they’ve probably never known?

Feature Image Credit: online information being given freely – picture from Pexels

Tom Goodwin is head of innovation at Zenith Media. A writer and speaker, Goodwin is the author of Digital Darwinism: Survival of the Fittest in the Age of Business Disruption. Previously, he has spoken at leading conferences and industry events around world, including Cannes Lions and CES.

Sourced from The Drum