
prostheticknowledge

prostheticknowledge:

Google Cardboard

Lo-fi, simplified VR from Google I/O: an open-source solution for viewing stereoscopic content on the web with your Android smartphone.

This isn’t the first of its kind (there are FOV2GO and many other independent efforts), but it is certainly a decent framework and platform ready for experimentation. It won’t compete with high-end technology (although that, too, has benefited from developments in smartphone components such as gyroscopes and displays).
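Those smartphone components are also what make the head tracking possible here: on the web, the browser exposes the phone’s fused gyroscope/accelerometer data through the standard deviceorientation event. Below is a minimal TypeScript sketch of reading it; the event fields are real browser API, but the flat yaw/pitch mapping is a simplification of what a proper viewer (which would use quaternions) does:

```typescript
// Minimal head-tracking sketch: map the browser's deviceorientation
// event to yaw/pitch angles that a renderer could apply to its camera.
// Simplified: a real viewer also handles screen-orientation changes
// and uses quaternions to avoid gimbal lock.

interface HeadPose {
  yaw: number;   // rotation about the vertical axis, in radians
  pitch: number; // looking up/down, in radians
}

const pose: HeadPose = { yaw: 0, pitch: 0 };
const DEG2RAD = Math.PI / 180;

window.addEventListener('deviceorientation', (e: DeviceOrientationEvent) => {
  // alpha = compass heading, beta = front-back tilt, both in degrees;
  // fields can be null before the sensors deliver their first reading.
  if (e.alpha !== null) pose.yaw = e.alpha * DEG2RAD;
  if (e.beta !== null) pose.pitch = (e.beta - 90) * DEG2RAD; // phone held upright in the viewer
});

// A render loop would read `pose` every frame and rotate the camera to match.
```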

Below is a video from the I/O conference - it is 45 minutes long and much of it focuses on the development side, but the first 15 minutes will give you a good idea of what is possible:

Virtual reality has made exciting progress over the past several years. However, developing for VR still requires expensive, specialized hardware. Thinking about how to make VR accessible to more people, a group of VR enthusiasts at Google experimented with using a smartphone to drive VR experiences.

The result is Cardboard, a no-frills enclosure that transforms a phone into a basic VR headset, and the accompanying open software toolkit that makes writing VR software as simple as building a web or mobile app.

By making it easy and inexpensive to experiment with VR, we hope to encourage developers to build the next generation of immersive digital experiences and make them available to everyone.
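To get a feel for what such a toolkit does under the hood: the core trick of any phone-based viewer is to render the scene twice per frame, once per eye, into the two halves of the screen, with the virtual cameras offset by roughly the distance between your pupils. Here is a minimal sketch of that idea in TypeScript using three.js; this is not Cardboard’s actual SDK, it omits the lens-distortion correction a real viewer applies, and the eye-separation value is illustrative:

```typescript
import * as THREE from 'three';

// Side-by-side stereo rendering sketch: one scene, two eye cameras.
const scene = new THREE.Scene();
scene.add(new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial(),
));

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const EYE_SEPARATION = 0.064; // ~64 mm interpupillary distance (illustrative)
const halfW = window.innerWidth / 2;
const aspect = halfW / window.innerHeight;
const leftEye = new THREE.PerspectiveCamera(90, aspect, 0.1, 100);
const rightEye = new THREE.PerspectiveCamera(90, aspect, 0.1, 100);
leftEye.position.set(-EYE_SEPARATION / 2, 0, 3);
rightEye.position.set(EYE_SEPARATION / 2, 0, 3);

function renderFrame() {
  renderer.setScissorTest(true); // clip each eye's draw to its half

  renderer.setViewport(0, 0, halfW, window.innerHeight);
  renderer.setScissor(0, 0, halfW, window.innerHeight);
  renderer.render(scene, leftEye);

  renderer.setViewport(halfW, 0, halfW, window.innerHeight);
  renderer.setScissor(halfW, 0, halfW, window.innerHeight);
  renderer.render(scene, rightEye);

  requestAnimationFrame(renderFrame);
}
renderFrame();
```

The small horizontal offset between the two renders is all the brain needs to fuse the halves into a single scene with depth; the cardboard lenses just let your eyes focus on a screen a few centimetres away.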

All the instructions, designs, and more information can be found at the project homepage here.

A kit with all the pieces you will need can be ordered from Dodocase here.

A collection of experiments ready for this system can be found here.

(via spacenookie)

hackaday.io

prostheticknowledge:

Raspberry Eye

Wearable computing device by genericsoma uses cheap materials and a Raspberry Pi computer - video embedded below:

I wanted to do something interesting with my RPi and a 2.4” TFT LCD. Google Glass was hot in the news, so I decided to hack something similar. The semi-transparent mirror was extracted from an Eye of Horus Beamsplitter, and the projection lens is cut from a plastic 3x Fresnel magnifying lens. The box and mounting parts are 3D-printed from ABS. The head strap is from a GoPro. One RPi USB port is used for WiFi, and the second for a small 2.4 GHz wireless keyboard/mouse combo. Altogether it cost around $100.

This is a first version with basic functionality; the maker hopes to add more features such as voice recognition and augmented reality.

You can find out more at the project’s webpage here

(via spacenookie)

futuristech-info

futuristech-info:

Future wearable devices to be powered by your body heat using flexible glass fabrics

(via turnon-bootup-jackin)

If You Like Immersion, You’ll Love This Reality

The news that Facebook paid $2 billion for a virtual reality start-up, Oculus VR, might strike you as a bit zany. Like flying cars and robotic maids, the idea of donning a pair of computerized glasses and slipping into a digital world feels like a snapshot from yesterday’s future.

Is something so self-consciously geeky really worth billions of dollars? What would a nontechie, nongamer do with virtual reality?

The answer: pretty much everything.

“I don’t worry anymore about whether it will be accepted by the mainstream — that will happen,” said Jeremy Bailenson, a virtual reality researcher who directs Stanford University’s Virtual Human Interaction Lab. Like many in his field, Dr. Bailenson argues that virtual reality technology is advancing so quickly that it is certain to infuse just about every corner of our lives. After trying out the technology in Dr. Bailenson’s lab this week, I believe he’s more right than wrong. Virtual reality is coming, and you’re going to jump into it.

That’s because virtual reality is the natural extension of every major technology we use today — of movies, TV, videoconferencing, the smartphone and the web. It is the ultra-immersive version of all these things, and we’ll use it exactly the same ways — to communicate, to learn, and to entertain ourselves and escape. Dr. Bailenson says that it will even alter how society deals with such weighty issues as gender parity and environmental destruction.

The only question is when.

Dr. Bailenson calls his lab’s advanced VR rig “one of the most intense, immersive virtual reality experiences on the planet.” In addition to running test subjects through the lab’s technology to see how people respond to virtual environments, he regularly hosts business leaders looking to experience the future of virtual reality. Just a few weeks before Facebook announced the Oculus acquisition, Mark Zuckerberg, a co-founder and the chief executive of Facebook, dropped by for a visit.

This week, in an hour-and-a-half session, Dr. Bailenson offered a series of simulations similar to the ones Mr. Zuckerberg experienced. Sometimes his guidance was physical; when I “fell” into a virtual pit as I scampered across a virtual plank, my real-life body crumpled, and Dr. Bailenson had to catch me. By the end of the simulation, I was a little dazed, and my neck hurt from wearing the lab’s five-pound, $30,000 goggles, which offer a far more realistic simulation than can be achieved with smaller, cheaper headsets, like Oculus’s Rift.

But I was hooked, too. I had experienced how an immersive virtual reality simulator can play strange tricks on one’s body, mind and mood.

And I could see how Mr. Zuckerberg might have come away from the lab optimistic about the future of this technology.

As Dr. Bailenson pointed out in “Infinite Reality,” a book he co-wrote about the future of VR, humans are an escapist lot. From books to movies to video games to iPads, whenever technology has presented us with ways to jettison our worries and slip into worlds of our own making, we’ve jumped at the chance. If the tech is good enough, virtual reality will be no different.

For years, the most convincing criticism of virtual reality was that the technology just wouldn’t be good enough. That’s still the main criticism. Virtual reality devices work by sending a computer-generated image to each of your eyes in response to your movements. The simulator’s fidelity depends on how accurately it can track your movements, and how quickly it can adjust the image to match the motion. If the technology is just a little off, the simulation fails.
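One standard way headsets fight that lag is prediction: rather than drawing where your head was when the sensors were last read, they extrapolate to where it will be when the frame actually reaches the display. A toy, yaw-only TypeScript sketch of the idea (real systems predict full 3D orientation from gyroscope data, and the 20 ms latency figure below is an assumed, illustrative value):

```typescript
// Toy head-orientation prediction: extrapolate yaw forward by the
// expected motion-to-photon latency, so the rendered view matches
// where the head will be when the frame reaches the display.

function predictYaw(
  currentYawRad: number,       // latest sensor reading, radians
  angularVelRadPerSec: number, // turn rate from the gyroscope
  latencySec: number,          // sensor-to-display delay
): number {
  return currentYawRad + angularVelRadPerSec * latencySec;
}

// Example: turning your head at 200 deg/s with an assumed 20 ms of
// pipeline latency means a non-predicting renderer would draw the
// world about 4 degrees behind where your head actually points.
const omega = 200 * (Math.PI / 180); // rad/s
const staleYaw = 0;                  // yaw at sensor-read time
const predicted = predictYaw(staleYaw, omega, 0.02);
console.log(`lag avoided: ${((predicted - staleYaw) * 180 / Math.PI).toFixed(1)} deg`);
```

A few degrees of error sounds small, but it is exactly the “slight discrepancy” Mr. Kelly describes: the visual world smearing behind your head motion is what breaks immersion and turns stomachs.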

“If you turn your head and look over there, you’ll notice a slight discrepancy, and your brain will feel it,” said Tadhg Kelly, a game designer who writes the blog What Games Are, and who has been skeptical of VR’s ability to go mainstream. “I wonder if that feeling of dislocation will ever quite go away, and if you’ll ever be able to get immersed in the scene.”

That gets to the Achilles’ heel of VR. If you’re aware of the simulation, virtual reality will feel gimmicky, and, from Smell-O-Vision to 3-D TV to the “uncanny valley” of animated faces, the history of media is littered with failed, gimmicky efforts to create better simulations of the real world.

“The biggest question we had was, ‘Was this the right time?’ ” said Chris Dixon, an investor at the venture capital firm Andreessen Horowitz, which led a $75 million investment in Oculus VR in December. “But we found that we’re just at the right time for all the stuff to work.”

Mr. Dixon pointed out that the rise of the smartphone industry had helped push down the cost of powerful displays and tracking components required for new VR headsets, putting virtual reality on the trajectory of mobile electronics and computers, which became cheaper and more mainstream as the technology improved.

He predicted that VR would be useful even before it was perfect, and said that gaming would not necessarily be its first breakout hit. “It’s such a compelling experience that some games — especially shooting games — will be too real,” Mr. Dixon said. “It would be as if you’re actually getting shot at.”

I know what he means. When I crashed into objects at Stanford’s virtual lab, the impending impact felt so real that I often rotated my body to blunt the force. The sensation wasn’t exactly enjoyable.

Rather than exciting physical feats, I was more thrilled by the simulator’s power to push me to forge emotional connections with other virtual characters. In one simulation, I walked into a room in which there were a dozen or so people, all seated in front of me, staring at my eyes. The people didn’t look realistic; they looked like video game characters, splotches of polygons.

Yet when the researchers asked me to walk toward the group, bend my head down and touch one of the people nose to nose, I found it incredibly difficult. In a way that I’ve never experienced in a video game, I felt as if I were dealing with real people — and violating their personal space.

It’s this ability to let us feel a sense of human connection that boosters say will make virtual reality a powerful communications platform. Today, companies spend billions on travel and videoconferencing, because even though we can all get our work done remotely, face-to-face meetings are powerful.

Yet researchers have shown that virtual meetings can be even better than real-life encounters, because our avatars can be programmed to behave in flattering, even manipulative ways that our real bodies can’t. For instance, in VR encounters, everyone can make eye contact with everyone else, suggesting a level of attentiveness that might be lacking in reality.
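That trick works because each participant’s view is rendered locally, so nothing forces every machine to draw the same avatar pose. A small TypeScript sketch of how such augmented gaze could be implemented (the vector math is standard; the scenario and variable names are illustrative):

```typescript
// "Augmented gaze" sketch: on each participant's machine, point the
// speaker's avatar head at the local viewer. Because every client does
// this independently, all participants perceive direct eye contact at
// once, something no physical meeting can offer.

type Vec3 = { x: number; y: number; z: number };

function lookDirection(from: Vec3, to: Vec3): Vec3 {
  const d = { x: to.x - from.x, y: to.y - from.y, z: to.z - from.z };
  const len = Math.hypot(d.x, d.y, d.z) || 1; // avoid division by zero
  return { x: d.x / len, y: d.y / len, z: d.z / len };
}

// On client A the speaker gazes at A's camera; on client B, at B's.
const speakerHead: Vec3 = { x: 0, y: 1.7, z: 0 };
const localCamera: Vec3 = { x: 2, y: 1.6, z: 3 }; // differs on every client
const gaze = lookDirection(speakerHead, localCamera);
console.log(gaze); // fed into the avatar's head orientation each frame
```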

Another potential use is pure escapism. History shows that we’ve never shied away from such immersive experiences when they work. Since their invention, movies have offered a compelling, immersive world. These days, we go outside wearing big headphones to transport us through music. And we can’t stop staring at our phones.

“If you use Oculus to look at a panoramic photo, it feels like you’re there,” Mr. Dixon said. “Who wouldn’t want to do that after a long day of work — to change your mental state by escaping into a photo?”

He added, “In some ways, the biggest competitor to virtual reality might be a bottle of wine.”

neurosciencestuff

neurosciencestuff:

Scientists Identify Key Cells in Touch Sensation

In a study published online today in the journal Nature, a team of Columbia University Medical Center researchers, led by Ellen Lumpkin, PhD, associate professor of somatosensory biology, solves an age-old mystery of touch: how cells just beneath the skin surface enable us to feel fine details and textures.

Touch is the last frontier of sensory neuroscience. The cells and molecules that initiate vision—rod and cone cells and light-sensitive receptors—have been known since the early 20th century, and the senses of smell, taste, and hearing are increasingly understood. But almost nothing is known about the cells and molecules responsible for initiating our sense of touch.

This study is the first to use optogenetics—a new method that uses light as a signaling system to turn neurons on and off on demand—on skin cells to determine how they function and communicate.

The team showed that skin cells called Merkel cells can sense touch and that they work virtually hand in glove with the skin’s neurons to create what we perceive as fine details and textures.

“These experiments are the first direct proof that Merkel cells can encode touch into neural signals that transmit information to the brain about the objects in the world around us,” Dr. Lumpkin said.

The findings not only describe a key advance in our understanding of touch sensation, but may stimulate research into loss of sensitive-touch perception.

Several conditions—including diabetes and some cancer chemotherapy treatments, as well as normal aging—are known to reduce sensitive touch. Merkel cells begin to disappear in one’s early 20s, at the same time that tactile acuity starts to decline. “No one has tested whether the loss of Merkel cells causes loss of function with aging—it could be a coincidence—but it’s a question we’re interested in pursuing,” Dr. Lumpkin said.

In the future, these findings could inform the design of new “smart” prosthetics that restore touch sensation to limb amputees, as well as introduce new targets for treating skin diseases such as chronic itch.

The study was published in conjunction with a second study by the team, done in collaboration with the Scripps Research Institute. The companion study identifies a touch-activated molecule in skin cells, encoded by a gene called Piezo2, whose discovery has the potential to significantly advance the field of touch perception.

“The new findings should open up the field of skin biology and reveal how sensations are initiated,” Dr. Lumpkin said. Other types of skin cells may also play a role in sensations of touch, as well as less pleasurable skin sensations, such as itch. The same optogenetics techniques that Dr. Lumpkin’s team applied to Merkel cells can now be applied to other skin cells to answer these questions.

“It’s an exciting time in our field because there are still big questions to answer, and the tools of modern neuroscience give us a way to tackle them,” she said.

(via underlockkey)