Data privacy

Discovered at DEFCON 27: automated license plate readers (ALPRs) being hoodwinked by clothing

Wednesday, August 14, 2019

It seems Joe Public is shouting "privacy here, privacy there, privacy everywhere." People are pushing back against technologies that could, or that they believe could, misidentify them; track, monitor and record their actions; or open the door to their personal information and identity being stolen.

It's a double-edged sword, really: people want to use technology to ensure safety and security, but at the same time, they want no interference with their privacy. It's all or nothing. Unfortunately, we aren't at a point with technology where "good" people are automatically distinguished from the "bad." However, one solution for protecting privacy presented itself about a week ago at none other than DEFCON 27.

As over 25,000 security professionals and researchers, federal government employees, lawyers, journalists, and of course, hackers with an interest in anything and everything that can be hacked descended on Las Vegas' Paris, Bally's, Flamingo and Planet Hollywood convention centers, professional ethical hacker and now fashion designer Kate Rose debuted her weapon of choice against ALPRs and surveillance: t-shirts, hoodies, jackets, dresses and skirts.

Known as Adversarial Fashion, each garment is purposely designed to trigger ALPRs and inject data rubbish into systems used by states and their contractors, believed by some to monitor and track civilians. Rose tested a series of modified license plate images against commercial ALPR APIs and created fabric patterns that read to ALPRs as if they were authentic license plates. Priced at no more than 50 bucks, these garments let you fool ALPRs with your clothes!

Don't feel like shelling out your hard-earned money? Not to worry! Rose lists all the resources needed to make your own computer vision-triggering fashion and fabric designs on her site, along with a hyperlinked list of libraries and APIs, image editing tools, color palette extraction tools and textile pattern tutorials. In addition, slides from her DEFCON 27 Crypto and Privacy Village talk, "Sartorial Hacking to Combat Surveillance," offer the following how-to guide for designing your own anti-surveillance clothes:

  1. Choose a recognition system and experiment with design constraints, starting with high confidence images.
  2. Test tolerances by making slight modifications to source images. 
  3. Make notes of “cue” attributes that affect confidence scores. 
  4. Plot enough images to determine what seems to work. 
  5. Use images that work to design a pattern and digitally print it onto fabric. 
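The testing loop behind steps 2 through 4 can be sketched in a few lines of Python. Everything here is illustrative: `fake_alpr_confidence` is a hypothetical stub standing in for a call to a commercial ALPR API, and the "image" is just a dictionary of tunable cue attributes rather than real pixels. The shape of the loop, though, is the one the guide describes: vary one attribute in small steps, record the recognizer's confidence for each variant, and keep the variants that still read as plates.

```python
def fake_alpr_confidence(variant: dict) -> float:
    """Hypothetical stand-in for a commercial ALPR API call; a real test
    would submit the modified plate image and read back the confidence
    score. Here, confidence simply falls as the "contrast" cue attribute
    drifts away from the standard value of 1.0."""
    return round(max(0.0, 1.0 - abs(variant["contrast"] - 1.0)), 2)

def sweep_attribute(base: dict, attr: str, values, threshold: float = 0.7):
    """Step 2: apply slight modifications to one cue attribute.
    Steps 3-4: record each variant's confidence score and flag which
    variants the recognizer still accepts above the chosen threshold."""
    results = []
    for v in values:
        variant = {**base, attr: v}          # one small modification at a time
        score = fake_alpr_confidence(variant)
        results.append((v, score, score >= threshold))
    return results

# Sweep the contrast cue across a small range and print what still "works."
base_image = {"contrast": 1.0, "char_spacing": 1.0}
for value, score, usable in sweep_attribute(
        base_image, "contrast", [0.6, 0.8, 1.0, 1.2, 1.4]):
    print(f"contrast={value:.1f}  confidence={score:.2f}  usable={usable}")
```

The variants flagged `usable` are the candidates for step 5: arranging them into a repeating fabric pattern for digital printing.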

I’m not too sure if this is a 5-step method to early retirement, but I can say people are demanding privacy and obviously, being very creative in their fight for it. 

 

The eavesdropping Alexa … is it really that much of a shock?

Wednesday, May 15, 2019

For the past few weeks, I have been rather intrigued with IoT devices, smart homes, and the security and safety of people in this context. (After all, aren't our homes supposed to be our safe haven … our place of escape from the crazy, hurried world we live in?) After perusing the internet on this topic, I thought I had read about almost everything imaginable, but I was thrown a curve ball by Geoffrey A. Fowler, technology columnist at The Washington Post, who literally made a song out of the recordings Alexa had of him! (Click here to listen.)

Fowler reported that he listened to four years of his Alexa archive that highlighted fragments of his life: spaghetti-timer requests, houseguests joking and random snippets of a once-popular TV show. Alexa even captured and recorded sensitive conversations—a family discussion about medication and a friend conducting a business deal—apparently triggered by Alexa’s “wake word” to start recording. So, why are tech companies recording and saving our voice data? According to Amazon, “when using an Alexa-enabled device, the voice recordings associated with your account are used to improve the accuracy of the results.” 

Fact or fiction? Maybe both, because another main reason is to train their artificial intelligence (AI). 

I may be going out on a limb here, but if people's voice data is being recorded and USED without their knowledge, isn't this an invasion of privacy? I say, "Yes, without a doubt!" Not only that, but shouldn't these tech companies hire and pay people for their voice data to train their AI? I mean, "free" saves the companies money, but should that come at the expense of people's private conversations and information being recorded and used without permission?

So, what can be done? Muting Alexa's microphone or unplugging it defeats the device's purpose, but in my opinion, if I were going to have a private conversation, that would be better than putting my personal business out there. Another option would be to delete Alexa voice recordings, but Amazon warns:

  • “If you delete voice recordings, it could degrade your experience when using the device.” 
  • “Deleting voice recordings does not delete your Alexa Messages.” 
  • “You may be able to review and play back voice recordings as the deletion request is being processed.” 

(I wonder what a "degraded Alexa experience" entails, and also how long it takes to process a deletion request, since during that time voice data can still be used.)

For me personally, I will stick with the “old-fashioned” way of living to preserve and protect my privacy—physically stand up, walk over to the window and close/open the blinds by hand; set alarms manually on my smartphone or built-in timer on my microwave; and even use the remote to turn the TV off and on, change channels and control the volume. 

By the way, don’t forget to listen to your own Alexa archive here or in the Alexa app: Settings > Alexa Account > Alexa Privacy. What all does Alexa have on you? 

 

Data privacy more important than ever before

03/20/2019

YARMOUTH, Maine—With data privacy taking center stage both in Europe with the General Data Protection Regulation (GDPR) and here in the U.S.