Friday, March 2, 2007

MyLifeBits

Microsoft has been working on a gem of a project for the last few years. It’s called MyLifeBits, and it is featured in the March issue of Scientific American.

What is it?

Basically, it’s an attempt to record and digitize information from all modalities in your life. These include books, photos, videos, emails, text, phone calls, locations and travels, websites you encounter, and also internal signals like heart rate and breathing rate (or its cessation); essentially, the sky is the limit. It aims to take advantage of ubiquitous sensors and computing devices so that there is no cognitive burden on the user. The effort goes further, organizing this enormous collection of continuous data streams into a software interface that serves as a UI both for looking back on one’s life and for recognizing low-level patterns and suggesting possible changes (like in work productivity, or possible health alerts). All in all, it attempts to realize Vannevar Bush’s Memex machine from 1945, back when the technology for his idea wasn’t yet practical. Although the idea is intriguing and quite possibly one of the most revolutionary technological developments in human history, I have my concerns: some of which I can imagine difficult, albeit possible, solutions for, and some of which I can’t…
First and foremost is the issue of privacy and security. If you think security is important now, with your bank account data and social security number tucked away inside your PC, imagine what the security threat would be like for ALL of your personal data! Your health records, legal matters, financial information, interpersonal transactions, confidential job matters, private activities, and the like would all be fair game if someone gained access to your PC. It’s fairly safe to say that this could be a potential showstopper for MyLifeBits ever being fully embraced by the public. But there may be security breakthroughs in the future that I just can’t account for, so for now we can pretend that a solution exists for this problem.

Privacy, on the other hand, is a much fuzzier problem. Even if you were able to protect your files from hackers, there is still the issue of assigning semantic interpretations to content and deciding which content is available to which people (if there is indeed a function that allows you to share some of your life bits with others). Who has access? How do you control negotiated access to parts of your life? We are already having difficulty solving this problem with simple cell phones and socially collaborative websites. This problem will only be exacerbated 100-fold with the amounts and types of information in this project.
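To make the "negotiated access" idea concrete, here is a minimal sketch of what per-item audience permissions might look like. Nothing here reflects MyLifeBits' actual design; the `LifeBit` schema, field names, and `visible_to` helper are all hypothetical illustrations.

```python
from dataclasses import dataclass, field

@dataclass
class LifeBit:
    """One recorded item (photo, GPS ping, email, ...) -- hypothetical schema."""
    kind: str
    timestamp: float
    payload: str
    # Audience groups granted access; an empty set means strictly private.
    audiences: set = field(default_factory=set)

def visible_to(bit, viewer_groups):
    """A bit is visible if the viewer belongs to at least one granted audience."""
    return bool(bit.audiences & viewer_groups)

# Example: a vacation photo shared with family, a pay stub kept private.
photo = LifeBit("photo", 1172860000.0, "beach.jpg", {"family", "friends"})
paystub = LifeBit("document", 1172860100.0, "pay_stub.pdf", set())

print(visible_to(photo, {"family"}))    # True
print(visible_to(paystub, {"family"}))  # False
```

Even this toy version hints at the hard part: someone (or something) still has to decide, for every one of millions of bits, which audience sets are appropriate.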

Then there’s the issue of controlling which content you even want recorded AT ALL! A person may not want some parts of his/her life recorded at times, for reasons personal only to them.

For instance, to use a bold example: let’s say you wanted to engage in a plot to commit some crime. You probably don’t want information related to this activity recorded, for fear of it being confiscated if you are suspected of committing it. You would not want investigators to see that you were indeed at this GPS location at the time of the crime, where you came from and where you went afterwards, what activities you engaged in before it was committed, and all the other possible damning evidence. This is an extreme and, hopefully, unlikely scenario, but it paints the picture I am trying to show. There are many sides to our lives, some large and most small, that we prefer not to share with people, for whatever reason.

This is the same reason why people are so wary about emerging technologies that promise to one day read people’s conscious thoughts. People aren’t comfortable with their thoughts being observed and broadcast to the world. I fear they also won’t be comfortable with every internal and external facet of their lives being available for dissection and possible judgment by others… People have a lot of ugly thoughts and behaviors. It’s just the reality of things. Our ability to inhibit our primal urges and bad habits is part of what makes us who we are today. How degrading to think that our ability to inhibit ourselves socially becomes completely moot once the raw data of our thoughts and behaviors is analyzed. There should be some way to start and stop incoming information at certain times… but that negates the purpose of ubiquitous and continuous technology! And some of the sensors may not allow for easy start-and-stop functionality…

Complicating the problem even further, if we add in social sharing functionality and negotiated access, we will encounter awkward situations as our friends and loved ones wonder why parts of our data stream are incomplete. A spouse may inquire why you weren’t logging your life bits yesterday between 3:00 and 6:00 in the afternoon. Basically, the system would be broadcasting which times you are comfortable with people knowing everything about your life and which times you are not. Very awkward.
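The gaps themselves are trivially detectable, which is what makes them so conspicuous. A quick sketch (the function name and threshold are my own invention, not anything from MyLifeBits):

```python
def recording_gaps(timestamps, max_silence=300.0):
    """Find intervals longer than max_silence seconds with no logged bits.
    These are exactly the conspicuous holes a spouse might notice."""
    ts = sorted(timestamps)
    return [(a, b) for a, b in zip(ts, ts[1:]) if b - a > max_silence]

# Logged events (seconds since midnight); nothing between 3 PM and 6 PM.
log = [53700.0, 53940.0, 54000.0, 64800.0, 64860.0]  # 54000 = 3 PM, 64800 = 6 PM
print(recording_gaps(log))  # [(54000.0, 64800.0)]
```

In other words, turning the recorder off doesn’t hide anything; it just replaces the content of the secret with the fact that there is one.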

How about memories that you don’t necessarily want to remember? In Microsoft’s PowerPoint presentation describing MyLifeBits, they mention a slideshow/screensaver feature that will at times reflect upon one’s life memories. This sounds nice. But what if it displays an image or video of a horrific automobile accident? Will I want to see that? Or what if it shows me footage of my grandmother on her deathbed? I may not want to review those life bits. There may be many other, more serious life bits that we wouldn’t want to encounter again. Doesn’t this sound like a recipe for aggravating Post-Traumatic Stress Disorder? Would a rape victim want to review the horrible act? Would a retired soldier want to revisit horrible scenes from war?

You might think, “Well, we could just assign trauma tags to the memories we don’t want to review, so that they don’t pop up in screensavers.” That’s a good idea, but the sad truth is that we may not be able to identify the data we know to be negatively affecting. Even at the time of an event, such as seeing that horrific car accident, I might not know that the scene will be emotionally disruptive when it pops up on my radar again. The neuroscience of emotional signaling is still very rudimentary, and much of it is not accessible to consciousness…
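Mechanically, the trauma-tag filter itself is simple; the sketch below (tag names and structure are hypothetical, not from MyLifeBits) shows both how it would work and its built-in limitation: anything the user never flagged still slips through.

```python
def screensaver_pool(bits, excluded_tags):
    """Return only memories considered safe to surface: drop anything the
    user has flagged with an excluded tag. Items never flagged at all will
    still slip through -- the limitation discussed above."""
    return [b for b in bits if not (b["tags"] & excluded_tags)]

memories = [
    {"file": "graduation.jpg", "tags": {"family", "milestone"}},
    {"file": "accident.jpg",   "tags": {"trauma"}},
    {"file": "beach_trip.jpg", "tags": {"vacation"}},
]

safe = screensaver_pool(memories, excluded_tags={"trauma"})
print([m["file"] for m in safe])  # ['graduation.jpg', 'beach_trip.jpg']
```

So the engineering is easy; the hard problem is generating the tags, which, as noted above, may require emotional self-knowledge we simply don’t have at recording time.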

If Microsoft can figure out ingenious ways to solve the above issues and concerns, then that will probably have been the hard part. The easier, yet just as important, parts will be issues such as the usage and usability of the interface, organization methods that easily allow for associative combinations and changes to the organizational structure, and intelligent software algorithms that can learn and autonomously discover meaningful patterns in low-level data. The latter sounds like a job for Jeff Hawkins and his theory of Hierarchical Temporal Memory (HTM) systems.