January 26, 2018

Data Visualization in AR

[tatsu_section bg_color= "" bg_image= "" bg_repeat= "no-repeat" bg_attachment= "scroll" bg_position= "top left" bg_size= "cover" bg_animation= "none" padding= "90px 0px 90px 0px" margin= "0px 0px 0px 0px" border= "0px 0px 0px 0px" border_color= "" bg_video= "0" bg_video_mp4_src= "" bg_video_ogg_src= "" bg_video_webm_src= "" bg_overlay= "0" overlay_color= "" full_screen= "0" section_id= "" section_class= "" section_title= "" offset_section= "" offset_value= "0px" full_screen_header_scheme= "background--dark" hide_in= "0"][tatsu_row full_width= "0" no_margin_bottom= "0" equal_height_columns= "0" gutter= "medium" column_spacing= "px" fullscreen_cols= "0" swap_cols= "0" row_id= "" row_class= "" hide_in= "0" layout= "1/1"][tatsu_column bg_color= "" bg_image= "" bg_repeat= "no-repeat" bg_attachment= "scroll" bg_position= "top left" bg_size= "cover" padding= "0px 0px 0px 0px" custom_margin= "0" margin= "0px 0px 0px 0px" border= "0px 0px 0px 0px" border_color= "" bg_video= "0" bg_video_mp4_src= "" bg_video_ogg_src= "" bg_video_webm_src= "" bg_overlay= "0" overlay_color= "" animate_overlay= "none" link_overlay= "" vertical_align= "none" column_offset= "0" offset= "0px 0px" z_index= "0" column_parallax= "0" animate= "0" animation_type= "fadeIn" animation_delay= "0" col_id= "" column_class= "" hide_in= "0" layout= "1/1"][tatsu_text max_width= "100" wrap_alignment= "center" animate= "" animation_type= "fadeIn" animation_delay= "0"]

As the next generation of computer interfaces emerges, it is important for our clients to adopt the technology early. In response, VirtuLabs has researched and developed interaction methods for financial software in augmented and virtual reality.

We explored the language of communicating data in 3D, looking for ways to take advantage of every dimension of augmented and virtual reality to deliver information based on the user’s perspective, interest, and urgency.

By analyzing the user’s gaze through reactive design, we built a mechanism that becomes aware of the user’s intention, and with it a system for presenting massive amounts of data organized spatially. The user can walk through and explore the data, clicking, moving, and interacting with the different visualizations. Movement through space along the Z axis provides different levels of detail for specific data.
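The distance-to-detail idea can be sketched in a few lines. This is an illustrative, framework-agnostic sketch only, not VirtuLabs’ actual code; the tier names and distance thresholds are assumptions for demonstration.

```python
# Hypothetical sketch: map a viewer's distance along the Z axis to a
# level-of-detail tier, so data closer to the user reveals more detail.
# Thresholds and tier names are illustrative assumptions.

def detail_level(z_distance: float) -> str:
    """Return a coarse-to-fine detail tier for a data node z_distance metres away."""
    if z_distance < 0.0:
        raise ValueError("distance must be non-negative")
    if z_distance < 1.0:
        return "full"      # raw values, labels, interaction handles
    if z_distance < 3.0:
        return "summary"   # aggregated values only
    return "glyph"         # a simple marker in the spatial layout

print(detail_level(0.5))  # full
print(detail_level(2.0))  # summary
print(detail_level(5.0))  # glyph
```

In an engine like Unity, the same mapping would typically run per frame against the camera position to swap visualization prefabs or shader parameters.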

[/tatsu_text][/tatsu_column][/tatsu_row][/tatsu_section]

September 9, 2017

Exploring landscapes using ARKit

[tatsu_section bg_color= "" bg_image= "" bg_repeat= "no-repeat" bg_attachment= "scroll" bg_position= "top left" bg_size= "cover" bg_animation= "none" padding= "15px 0px 15px 0px" margin= "0px 0px 0px 0px" border= "0px 0px 0px 0px" border_color= "" bg_video= "0" bg_video_mp4_src= "" bg_video_ogg_src= "" bg_video_webm_src= "" bg_overlay= "0" overlay_color= "" full_screen= "0" section_id= "" section_class= "" section_title= "" offset_section= "" offset_value= "0" full_screen_header_scheme= "background--dark" hide_in= "0"][tatsu_row full_width= "0" no_margin_bottom= "0" equal_height_columns= "0" gutter= "medium" column_spacing= "" fullscreen_cols= "0" swap_cols= "0" row_id= "" row_class= "" hide_in= "0" layout= "1/1"][tatsu_column bg_color= "" bg_image= "" bg_repeat= "no-repeat" bg_attachment= "scroll" bg_position= "top left" bg_size= "cover" padding= "0px 0px 0px 0px" custom_margin= "0" margin= "0px 0px 0px 0px" border= "0px 0px 0px 0px" border_color= "" bg_video= "0" bg_video_mp4_src= "" bg_video_ogg_src= "" bg_video_webm_src= "" bg_overlay= "0" overlay_color= "" animate_overlay= "none" link_overlay= "" vertical_align= "none" column_offset= "0" offset= "0px 0px" z_index= "0" column_parallax= "0" animate= "0" animation_type= "fadeIn" animation_delay= "0" col_id= "" column_class= "" hide_in= "0" layout= "1/1"][tatsu_text max_width= "100" wrap_alignment= "center" animate= "" animation_type= "fadeIn" animation_delay= "0"]

A simple case study that lets the user roam freely, using an iOS device to scan the environment. Once a virtual object is within reach, the user can interact with it using touch controls, and a score is tallied for each object. Materials were created in Substance Painter, and an HDR environment map was captured on-site, giving the virtual objects real reflections of the player’s surroundings. Animation and export were done through Unity 2017.
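The reach-and-score mechanic described above boils down to a proximity check plus a counter. A minimal sketch, assuming a fixed reach radius (the `1.0` metre value, the `Tally` class, and all names here are hypothetical, not from the project):

```python
# Framework-agnostic sketch of "object within reach unlocks interaction,
# each interaction tallies a score". The original was built in Unity;
# this only mirrors the described logic.
import math

REACH_RADIUS = 1.0  # assumed: metres within which an object becomes interactive

def within_reach(user_pos, object_pos, radius=REACH_RADIUS):
    """True when the virtual object is close enough to respond to touch controls."""
    return math.dist(user_pos, object_pos) <= radius

class Tally:
    """Accumulates a score each time the user interacts with a reachable object."""
    def __init__(self):
        self.score = 0

    def interact(self, user_pos, object_pos, points=1):
        if within_reach(user_pos, object_pos):
            self.score += points
            return True
        return False

tally = Tally()
tally.interact((0, 0, 0), (0.5, 0, 0))  # in reach: scores
tally.interact((0, 0, 0), (5.0, 0, 0))  # too far: ignored
print(tally.score)  # 1
```

In the actual app, the user position would come from the ARKit-tracked camera transform rather than a hard-coded tuple.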

[/tatsu_text][/tatsu_column][/tatsu_row][/tatsu_section]

June 11, 2017

Virtual Reality + Eye Tracking providing hands-free control in 3D worlds

[tatsu_section bg_color= "" bg_image= "" bg_repeat= "no-repeat" bg_attachment= '{"d":"scroll"}' bg_position= '{"d":"top left"}' bg_size= '{"d":"cover"}' bg_animation= "none" padding= '{"d":"15px 0px 15px 0px"}' margin= '{"d":"0px 0px 0px 0px"}' border= "0px 0px 0px 0px" border_color= "" bg_video= "0" bg_video_mp4_src= "" bg_video_ogg_src= "" bg_video_webm_src= "" bg_overlay= "0" overlay_color= "" full_screen= "0" enable_custom_height= "" custom_height= '{"d":""}' vertical_align= "center" top_divider= "none" top_divider_zindex= "9999" bottom_divider_zindex= "9999" bottom_divider= "none" top_divider_height= '{"d":100}' top_divider_position= "above" bottom_divider_height= '{"d":100}' bottom_divider_position= "below" top_divider_color= "#ffffff" bottom_divider_color= "#ffffff" invert_top_divider= "0" invert_bottom_divider= "0" flip_top_divider= "0" flip_bottom_divider= "0" section_id= "" section_class= "" section_title= "" offset_section= "" offset_value= "0" full_screen_header_scheme= "background--dark" z_index= "0" overflow= "" hide_in= "0" key= "hco7ka2f6o3ftbiv"][tatsu_row full_width= "0" bg_color= "" border= '{"d":"0px 0px 0px 0px"}' border_color= "" no_margin_bottom= "0" equal_height_columns= "0" gutter= "medium" column_spacing= "" fullscreen_cols= "0" swap_cols= "0" padding= '{"d":"0px 0px 0px 0px"}' margin= '{"d":"0px 0px"}' row_id= "" row_class= "" hide_in= "0" box_shadow= "0px 0px 0px 0px rgba(0,0,0,0)" border_radius= "0" layout= "1/1" key= "hco7ka2f922j9udu"][tatsu_column bg_color= "" bg_image= "" bg_repeat= "no-repeat" bg_attachment= "scroll" bg_position= '{"d":"top left"}' bg_size= '{"d":"cover"}' padding= '{"d":"0px 0px 0px 0px"}' custom_margin= "0" margin= '{"d":"0px 0px 0px 0px"}' border= '{"d":"0px 0px 0px 0px"}' border_color= "" border_radius= "0" enable_box_shadow= "0" box_shadow_custom= "0px 0px 0px 0px rgba(0,0,0,0)" bg_video= "0" bg_video_mp4_src= "" bg_video_ogg_src= "" bg_video_webm_src= "" bg_overlay= "0" overlay_color= "" animate_overlay= 
"none" link_overlay= "" vertical_align= "none" column_offset= "0" sticky= "0" offset= '{"d":"0px 0px"}' z_index= "0" top_divider= "none" bottom_divider= "none" top_divider_height= '{"d":{"d":"100","m":"0"}}' bottom_divider_height= '{"d":{"d":"100","m":"0"}}' top_divider_color= "#ffffff" bottom_divider_color= "#ffffff" flip_top_divider= "0" flip_bottom_divider= "0" left_divider= "none" left_divider_width= '{"d":{"d":"50","m":"0"}}' left_divider_color= "#ffffff" invert_left_divider= "0" right_divider= "none" right_divider_width= '{"d":{"d":"50","m":"0"}}' right_divider_color= "#ffffff" invert_right_divider= "0" column_parallax= "0" top_divider_zindex= "9999" bottom_divider_zindex= "9999" right_divider_zindex= "9999" left_divider_zindex= "9999" column_width= '{"d":"100"}' overflow= "" column_mobile_spacing= "0" animate= "0" animation_type= "fadeIn" animation_delay= "0" image_hover_effect= "none" column_hover_effect= "none" hover_box_shadow= "0px 0px 0px 0px rgba(0,0,0,0)" col_id= "" column_class= "" hide_in= "0" layout= "1/1" key= "hco7ka2fbn7p9dsz"][tatsu_text bg_color= "" max_width= '{"d":"100"}' wrap_alignment= "center" text_alignment= '{"d":"left"}' animate= "" animation_type= "fadeIn" animation_delay= "0" margin= '{"d":"0px 0px 30px 0px"}' box_shadow= "0px 0px 0px 0px rgba(0,0,0,0)" padding= '{"d":"0px 0px 0px 0px"}' border_radius= "0" key= "hco7ka2fdnc3c9bq"]

As part of their R&D initiatives, VirtuLabs has been helping Pillantas prototype various eye-tracking concepts, with a focus on applications ranging from social media to augmented reality. Among these projects are eye-controlled Snapchat glasses and a shared-reality Hololens experience that allows users to control physical objects, such as drones, with eye gestures in an augmented environment.

Furthermore, Pillantas has been exploring the use of Unity3D and eye-tracking technology for medical purposes. This groundbreaking approach enables healthcare professionals to interact with complex 3D models, simulations, and medical imaging data using only their eyes, thereby increasing efficiency and reducing the risk of contamination.

Additionally, Pillantas has been investigating the potential of eye-tracking technology in gaming. By incorporating this technology into popular game engines like Unity3D, Pillantas envisions a future where gamers can control and manipulate in-game objects using only their eyes. This innovative control scheme not only enhances immersion and gameplay but also opens up new possibilities for accessibility in gaming, catering to users with physical disabilities.
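A common pattern behind hands-free selection like the concepts above is "dwell" activation: holding the gaze on a target for a threshold duration triggers the equivalent of a click. The sketch below is an illustrative assumption about how such a detector could work, not Pillantas’ implementation; the threshold and sample format are hypothetical.

```python
# Hypothetical dwell-based gaze selection: a target is "clicked" once the
# gaze rests on it continuously for DWELL_SECONDS.

DWELL_SECONDS = 0.8  # assumed activation threshold

def dwell_select(gaze_samples, threshold=DWELL_SECONDS):
    """gaze_samples: list of (timestamp_seconds, target_id_or_None) in time order.
    Returns the first target held continuously for `threshold` seconds, else None."""
    start_time = None
    current = None
    for t, target in gaze_samples:
        if target is not None and target == current:
            if t - start_time >= threshold:
                return current
        else:
            # Gaze moved to a new target (or off-target): restart the dwell timer.
            current = target
            start_time = t
    return None

samples = [(0.0, "drone"), (0.3, "drone"), (0.9, "drone")]
print(dwell_select(samples))  # drone
```

Interrupting the gaze resets the timer, which is what makes dwell selection robust against accidental activation while scanning a scene.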

[/tatsu_text][tatsu_text bg_color= "" max_width= '{"d":"100"}' wrap_alignment= "center" text_alignment= '{"d":"left"}' animate= "" animation_type= "fadeIn" animation_delay= "0" margin= '{"d":"0px 0px 30px 0px"}' box_shadow= "0px 0px 0px 0px rgba(0,0,0,0)" padding= '{"d":"0px 0px 0px 0px"}' border_radius= "0" key= "hco7ka2hrvettojc"]

We are very happy and excited to see our client Pillantas shine on stage at AWE 2017 and 2022 in San Jose.

[/tatsu_text][tatsu_code id= "" class= "" key= "SysBglqWn"][/tatsu_code][tatsu_text bg_color= "" max_width= '{"d":"100"}' wrap_alignment= "center" text_alignment= '{"d":"left"}' animate= "" animation_type= "fadeIn" animation_delay= "0" margin= '{"d":"0px 0px 30px 0px"}' box_shadow= "0px 0px 0px 0px rgba(0,0,0,0)" padding= '{"d":"0px 0px 0px 0px"}' border_radius= "0px" key= "BJVkme9Wn"]

Pillantas' R&D portfolio showcases the immense potential of eye-tracking technology across various sectors, from healthcare to gaming. By pushing the boundaries of this technology, Pillantas is contributing to a more inclusive and interactive future for all.

[/tatsu_text][tatsu_code id= "" class= "" key= "r1kdJg9b3"][/tatsu_code][tatsu_text bg_color= "" max_width= '{"d":"100"}' wrap_alignment= "center" text_alignment= '{"d":"left"}' animate= "" animation_type= "fadeIn" animation_delay= "0" margin= '{"d":"0px 0px 30px 0px"}' box_shadow= "0px 0px 0px 0px rgba(0,0,0,0)" padding= '{"d":"0px 0px 0px 0px"}' border_radius= "0px" key= "BJ2U4ec-3"]

Footage from 2017, showcasing early research on controlling real-world objects (drones) with synchronized augmentation in Hololens.

[/tatsu_text][/tatsu_column][/tatsu_row][/tatsu_section]

November 30, 2016

Responsive Environments in OFFF Milan – DDD 2016

[tatsu_section bg_color= "" bg_image= "" bg_repeat= "no-repeat" bg_attachment= "scroll" bg_position= "top left" bg_size= "cover" bg_animation= "none" padding= "15px 0px 15px 0px" margin= "0px 0px 0px 0px" border= "0px 0px 0px 0px" border_color= "" bg_video= "0" bg_video_mp4_src= "" bg_video_ogg_src= "" bg_video_webm_src= "" bg_overlay= "0" overlay_color= "" full_screen= "0" section_id= "" section_class= "" section_title= "" offset_section= "" offset_value= "0" full_screen_header_scheme= "background--dark" hide_in= "0"][tatsu_row full_width= "0" no_margin_bottom= "0" equal_height_columns= "0" gutter= "medium" column_spacing= "" fullscreen_cols= "0" swap_cols= "0" row_id= "" row_class= "" hide_in= "0" layout= "1/1"][tatsu_column bg_color= "" bg_image= "" bg_repeat= "no-repeat" bg_attachment= "scroll" bg_position= "top left" bg_size= "cover" padding= "0px 0px 0px 0px" custom_margin= "0" margin= "0px 0px 0px 0px" border= "0px 0px 0px 0px" border_color= "" bg_video= "0" bg_video_mp4_src= "" bg_video_ogg_src= "" bg_video_webm_src= "" bg_overlay= "0" overlay_color= "" animate_overlay= "none" link_overlay= "" vertical_align= "none" column_offset= "0" offset= "0px 0px" z_index= "0" column_parallax= "0" animate= "0" animation_type= "fadeIn" animation_delay= "0" col_id= "" column_class= "" hide_in= "0" layout= "1/1"][tatsu_text max_width= "100" wrap_alignment= "center" animate= "" animation_type= "fadeIn" animation_delay= "0"]

We had the opportunity to speak at OFFF 2016 - Digital Design Days in Milan, where Sorob talked about the future of responsive environments. The talk was delivered through a Hololens, with the mixed-reality view of the audience and the presentation content projected on screen.

[/tatsu_text][tatsu_image image= "http://virtulabs.com/wp-content/uploads/sites/4/2016/11/14884444_661669347351792_4887599468869163579_o.jpg" image_varying_size_src= "" alignment= "none" enable_margin= "0" margin= "0px 0px 0px 0px" border_width= "0" border_color= "" id= "346" size= "full" adaptive_image= "0" rebel= "0" width= "100%%" shadow= "none" lazy_load= "0" placeholder_bg= "" animate= "0" animation_type= "fadeIn" animation_delay= "0"][/tatsu_image][tatsu_image image= "http://virtulabs.com/wp-content/uploads/sites/4/2016/11/15003370_699982266827239_8751426331588207727_o.jpg" image_varying_size_src= "" alignment= "none" enable_margin= "0" margin= "0px 0px 0px 0px" border_width= "0" border_color= "" id= "345" size= "full" adaptive_image= "0" rebel= "0" width= "100%%" shadow= "none" lazy_load= "0" placeholder_bg= "" animate= "0" animation_type= "fadeIn" animation_delay= "0"][/tatsu_image][/tatsu_column][/tatsu_row][/tatsu_section]