Smartglasses, Social Context and Privacy

Des Walsh in smart glasses, Kirra Beach

Last week I had the pleasure and privilege of trying out a pair of smartglasses. These were the M100 product from Vuzix, the first commercially available smartglasses in the world. As the Vuzix website explains:

The Vuzix M100 Smart Glasses is an Android-based wearable computer, enhanced with a wearable monocular display and computer, recording features and wireless connectivity capabilities designed for commercial, professional, and prosumer users. Its pre-installed apps can be used to record and playback still pictures and video, track timed events, manage your calendar, link to your phone and more.

They make the point that it is “designed primarily for use in enterprise, commercial and medical applications.”

I had looked enviously at the pictures of Robert Scoble wearing Google Glass (even allowing for some apparent disenchantment on his part) and read about that and other wearables in Age of Context by Robert Scoble and Shel Israel.

So being able to try out this kind of technology was a great treat.

The treat was made possible by my friends Yuval Hertzog and Uzi Bar-On, principals of the very talented and exciting local developer company Nubis, who specialise in this kind of augmented reality technology.

Social context revealed

It’s quite fascinating to look at someone through these glasses and see in a circle around their head, as in the image below, information about their social network.

Alphega software driving smartglasses

But as I used the glasses, I immediately thought of those stories of restaurants and other establishments banning the wearing of such devices, so as not to upset other patrons who might be concerned about their privacy.

So what are the privacy implications for this and related technologies?

I have a long-standing interest in issues of privacy and technology. In fact, in a past government service career I was responsible for producing the first brochure for public distribution explaining the then-new Australian Federal Privacy Act.

But I make no claims to special expertise on privacy, much less on how to handle privacy in relation to these advanced technologies that we know can be used for incredibly intrusive purposes as well as for good, as in various medical uses.

There is no doubt in my mind, however, that Yuval and Uzi are very alert to, and well informed about, the privacy implications as well as the finer points of the technology.

Yuval gave a very interesting and challenging overview of this emerging body of technology and the privacy implications in a keynote last month in Melbourne, Australia, for the Connect “Next Big Thing” expo – see Social Networks and Wearable Devices – privacy, anyone?

For people inclined to be fearful about technology, there was enough to make them anxious.

But in his keynote, while acknowledging the risks, Yuval was fundamentally optimistic – with the proviso that we all take seriously what is happening and work to see that the proper protocols and safeguards are put in place.

Interestingly for me, given what I have mentioned above about my past involvement with government initiatives in the privacy field, Yuval invokes the right to privacy in Article 12 of the Universal Declaration of Human Rights in support of his optimistic view:

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.

But let’s face it, there is no question that the technological genie is well and truly out of the bottle. And as Yuval observes, “I don’t think that you can stop technological progress nor should I think anyone need to. Technology is neither good nor bad. Its use can be, though.”

My view? Like Yuval, I’m an optimist about privacy and I believe we have the capacity as a community to take appropriate protective steps. As Yuval says,

“The long term benefits of this technology surpass the scary aspects tenfold. Privacy threat? That is as real and true as it was before and it is exactly why privacy aspects need to be considered and addressed from the start with a vision to where it will eventually go.”

Part of the solution seems to be to build into the technology the capacity to control it in the interests of privacy. For example, the people at Nubis apply their commitment to privacy protection in their ground-breaking Alphega “socioscope” (i.e. social networking periscope) product:

“With the switch of a gesture, you determine who can see your location, just as you see others. Alphega was designed with your privacy at the top priority, making sure you are “visible” only when you choose and to those you select.”
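To make that opt-in idea a little more concrete, here is a minimal sketch of a per-contact visibility model of the kind the quote describes. This is purely illustrative: it is not Nubis’s actual Alphega code, and every name in it is invented. The point is simply that location sharing can default to off and require both a deliberate toggle and an explicit selection of who may see you.

```python
# Hypothetical sketch of an opt-in location-visibility model, loosely
# inspired by the Alphega description above. Not Nubis's actual code.
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class VisibilitySettings:
    """Controls who may see this user's location. Default: nobody."""
    visible: bool = False                        # master switch (the "gesture")
    allowed_contacts: set[str] = field(default_factory=set)

    def allow(self, contact_id: str) -> None:
        """Explicitly select a contact who may see my location."""
        self.allowed_contacts.add(contact_id)

    def revoke(self, contact_id: str) -> None:
        """Withdraw a contact's permission at any time."""
        self.allowed_contacts.discard(contact_id)

    def can_see_me(self, contact_id: str) -> bool:
        # Location is shared only when the master switch is on
        # AND this specific contact has been explicitly selected.
        return self.visible and contact_id in self.allowed_contacts


if __name__ == "__main__":
    settings = VisibilitySettings()
    settings.allow("friend-123")

    print(settings.can_see_me("friend-123"))   # False: master switch still off
    settings.visible = True                    # the "gesture" toggles sharing on
    print(settings.can_see_me("friend-123"))   # True: visible and explicitly allowed
    print(settings.can_see_me("stranger-456")) # False: never selected
```

The design choice worth noting is the default: both conditions start closed, so a user who does nothing shares nothing.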

Your turn

What’s your take on the future of privacy in relation to these technologies? Maybe you’re not an optimist. Anyway, please leave a comment with your thoughts and views.

Des Walsh

Business coach and digital entrepreneur. With coach training from Coachville.com and its Graduate School of Coaching, and as a founding member of the International Association of Coaching, Des has been coaching business owners and entrepreneurs for the past 20 years. Over the same period he has also been actively engaged in promoting the business opportunities of the digital economy. He is a certified Neurolinguistic Programming (NLP) coach, and a certified specialist in social media strategy and affiliate marketing.

2 Comments

  1. Excellent discussion, Des. I’ve started talking with clients about the potential risks associated with wearable technology, especially in organisational settings. One example – many organisations hold sensitive meetings, wherein people are allowed to have their cell phones (voice recorders) present. I foresee a raft of new policies and ways of working to deal with the risks.

  2. That’s right, Joan. And with the increasing sophistication of these devices, we’re going to have to move well beyond “please turn off your phones”.

How will the person convening a meeting, or the various attendees, know that someone does not have a watch or pen or other item recording everything that is being said, or normal-looking specs that are in fact videoing everything that is happening (not fanciful, given what has already been developed)?

You are right, we need policies and protocols. But they will not be sufficient. In all such gatherings there is going to have to be a premium placed on trust. And the penalties for a breach of such trust will have to be commensurate with the breach.

    Interesting times!
