
Marcus P Ghaly

from Kirkland, WA
Age ~44

Marcus Ghaly Phones & Addresses

  • 927 5th Ave #E4, Kirkland, WA 98033
  • 123 112th Ave, Bellevue, WA 98004 (425) 454-7062
  • Seattle, WA

Resumes

Senior Interaction Designer at IdentityMine

Position:
Senior Interaction Designer at IdentityMine
Location:
Greater Seattle Area
Industry:
Design
Work:
IdentityMine since Sep 2012
Senior Interaction Designer

Research In Motion / The Astonishing Tribe - Malmö, Sweden Jan 2012 - Jun 2012
Thesis Worker

Malmö University Sep 2010 - Jun 2012
Master's Degree - Interaction Design

Sony Online Entertainment - Seattle Apr 2008 - Aug 2010
Environment Co-Lead

Sony Online Entertainment Oct 2007 - Mar 2008
Senior Environment Artist
Education:
Malmö University 2010 - 2012
Master's, Interaction Design
University of Washington 1998 - 2003
BA, Interdisciplinary Visual Arts

Business Records

Name / Title:
Marcus P Ghaly, Secretary
Company:
TOP RIDGE DEVELOPMENT CORPORATION
Address:
Bellevue, WA 98006

Publications

US Patents

Customizing Tabs Using Visual Modifications

US Patent:
20180246624, Aug 30, 2018
Filed:
Feb 24, 2017
Appl. No.:
15/441748
Inventors:
- Redmond WA, US
DANIELLE LAUREN ELLBOGEN - Seattle WA, US
MARCUS P. GHALY - Kirkland WA, US
CHRISTOPHER OBESO - Duvall WA, US
ANDREW M. PICKARD - Seattle WA, US
International Classification:
G06F 3/0483
G06F 3/0482
G06F 3/0484
G06F 17/30
Abstract:
Systems and methods are provided for customizing tabs in a browser window by facilitating visual modifications to the tabs. An indication of a user interaction with a tab, such as a hover or selection input, is received. Based on receiving this indication of a user interaction, one or more options from which a user may select to modify or add content to the tab are provided for display. These options may include, for instance, colors, icons, text modifications, objects, etc. A user selection is received from the one or more options. Based on the user selection, the tab is visually altered.
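The interaction flow the abstract describes (a hover or selection on a tab surfaces modification options, and the chosen option visually alters the tab) can be sketched as follows. This is an illustrative sketch only; the class and function names below are invented and do not come from the patent.

```python
# Hypothetical sketch of the tab-customization flow: interact with a tab,
# receive display options, apply the selected option to the tab.

class Tab:
    def __init__(self, title):
        self.title = title
        self.color = None
        self.icon = None

    def apply_option(self, option):
        # Visually alter the tab based on the user's selected option.
        kind, value = option
        if kind == "color":
            self.color = value
        elif kind == "icon":
            self.icon = value


def on_tab_interaction(tab):
    # Step 1: a hover/selection on the tab is received -> present options.
    return [("color", "red"), ("color", "blue"), ("icon", "star")]


def handle_selection(tab, options, choice_index):
    # Step 2: the user's selection is received -> the tab is altered.
    tab.apply_option(options[choice_index])
    return tab


tab = Tab("News")
opts = on_tab_interaction(tab)
handle_selection(tab, opts, 2)
print(tab.icon)  # -> star
```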

Selection Of Objects In Three-Dimensional Space

US Patent:
20180004283, Jan 4, 2018
Filed:
Jun 29, 2016
Appl. No.:
15/197484
Inventors:
Cheyne Rory Quin Mathey-Owens - Seattle WA, US
Arthur Tomlin - Kirkland WA, US
Marcus Ghaly - Kirkland WA, US
International Classification:
G06F 3/01
G06F 3/0481
G06F 3/0482
G06T 19/00
Abstract:
A user may select or interact with objects in a scene using gaze tracking and movement tracking. In some examples, the scene may comprise a virtual reality scene or a mixed reality scene. A user may move an input object in an environment and be facing in a direction towards the movement of the input object. A computing device may use sensors to obtain movement data corresponding to the movement of the input object, and gaze tracking data corresponding to a location of the eyes of the user. One or more modules of the computing device may use the movement data and gaze tracking data to determine a three-dimensional selection space in the scene. In some examples, objects included in the three-dimensional selection space may be selected or otherwise interacted with.
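As a rough illustration only: the patent does not specify the geometry of the selection space, so the spherical volume and every name below are assumptions. One way to model "objects inside a selection space derived from gaze and movement" is:

```python
# Hypothetical sketch: combine a gaze origin with the tracked end point of
# an input object's movement to define a spherical selection volume, then
# select the scene objects that fall inside it.

import math


def selection_space(gaze_origin, movement_end, radius):
    # Assumption: center the selection sphere where the tracked input
    # object's movement ended, in the direction the user is looking.
    return {"center": movement_end, "radius": radius}


def objects_in_space(space, objects):
    cx, cy, cz = space["center"]
    selected = []
    for name, (x, y, z) in objects.items():
        dist = math.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2)
        if dist <= space["radius"]:
            selected.append(name)
    return selected


space = selection_space((0, 0, 0), (1.0, 0.0, 2.0), radius=0.5)
scene = {"cube": (1.1, 0.1, 2.0), "lamp": (4.0, 0.0, 0.0)}
print(objects_in_space(space, scene))  # -> ['cube']
```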

Virtual Cues For Augmented-Reality Pose Alignment

US Patent:
20170287221, Oct 5, 2017
Filed:
Oct 27, 2016
Appl. No.:
15/336392
Inventors:
Marcus Ghaly - Kirkland WA, US
Andrew Jackson - Kirkland WA, US
Jeff Smith - Duvall WA, US
Michael Scavezze - Bellevue WA, US
Cameron Brown - Redmond WA, US
Charlene Jeune - Redmond WA, US
International Classification:
G06T 19/00
G02B 27/01
H04N 13/00
G06T 7/00
H04N 13/02
Abstract:
A method includes determining a current pose of an augmented reality device in a physical space, and visually presenting, via a display of the augmented reality device, an augmented-reality view of the physical space including a predetermined pose cue indicating a predetermined pose in the physical space and a current pose cue indicating the current pose in the physical space.

Updating Mixed Reality Thumbnails

US Patent:
20170200312, Jul 13, 2017
Filed:
Jan 11, 2016
Appl. No.:
14/992934
Inventors:
Jeff Smith - Duvall WA, US
Cameron Graeme Brown - Bellevue WA, US
Marcus Ghaly - Kirkland WA, US
Andrei A. Borodin - Bellevue WA, US
Jonathan Gill - Seattle WA, US
Cheyne Rory Quin Mathey-Owens - Seattle WA, US
Andrew Jackson - Kirkland WA, US
International Classification:
G06T 19/00
G06T 13/20
G02B 27/01
G06T 15/50
Abstract:
Examples are disclosed herein that relate to the display of mixed reality imagery. One example provides a mixed reality computing device comprising an image sensor, a display device, a storage device comprising instructions, and a processor. The instructions are executable to receive an image of a physical environment, store the image, render a three-dimensional virtual model, form a mixed reality thumbnail image by compositing a view of the three-dimensional virtual model and the image, and display the mixed reality thumbnail image. The instructions are further executable to receive a user input updating the three-dimensional virtual model, render the updated three-dimensional virtual model, update the mixed reality thumbnail image by compositing a view of the updated three-dimensional virtual model and the image of the physical environment, and display the mixed reality thumbnail image including the updated three-dimensional virtual model composited with the image of the physical environment.
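A toy sketch of the composite-then-update loop the abstract describes, with placeholder "pixels" standing in for a real camera image and a real renderer. Everything here is an assumption for illustration, not the patented implementation.

```python
# Hypothetical sketch: a mixed reality thumbnail is formed by compositing a
# rendered view of a virtual model over a stored image of the physical
# environment, and is re-composited whenever the model is updated.

def composite(background, overlay):
    # Overlay pixels of None are transparent; others replace the background.
    return [
        [bg if ov is None else ov for bg, ov in zip(brow, orow)]
        for brow, orow in zip(background, overlay)
    ]


camera_image = [["room"] * 3 for _ in range(2)]          # stored physical image
model_render = [[None, "cube", None], [None, None, None]]  # rendered model view

# Initial mixed reality thumbnail.
thumbnail = composite(camera_image, model_render)

# User input updates the model -> re-render, then re-composite against the
# same stored image of the physical environment.
model_render[0][1] = "sphere"
thumbnail = composite(camera_image, model_render)
print(thumbnail[0])  # -> ['room', 'sphere', 'room']
```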

Portable Holographic User Interface For An Interactive 3D Environment

US Patent:
20170052507, Feb 23, 2017
Filed:
Aug 21, 2015
Appl. No.:
14/832961
Inventors:
Adam Gabriel Poulos - Sammamish WA, US
Cameron Graeme Brown - Bellevue WA, US
Aaron Daniel Krauss - Snoqualmie WA, US
Marcus Ghaly - Kirkland WA, US
Michael Thomas - Redmond WA, US
Jonathan Paulovich - Redmond WA, US
Daniel Joseph McCulloch - Snohomish WA, US
International Classification:
G03H 1/00
G03H 1/22
G06T 19/00
G02B 27/01
Abstract:
Disclosed are a method and corresponding apparatus to enable a user of a display system to manipulate holographic objects. Multiple holographic user interface objects capable of being independently manipulated by a user are displayed to the user, overlaid on a real-world view of a 3D physical space in which the user is located. In response to a first user action, the holographic user interface objects are made to appear to be combined into a holographic container object that appears at a first location in the 3D physical space. In response to the first user action or a second user action, the holographic container object is made to appear to relocate to a second location in the 3D physical space. The holographic user interface objects are then made to appear to deploy from the holographic container object when the holographic container object appears to be located at the second location.

Building Holographic Content Using Holographic Tools

US Patent:
20160210781, Jul 21, 2016
Filed:
Jan 20, 2015
Appl. No.:
14/600886
Inventors:
Michael Thomas - Redmond WA, US
Jonathan Paulovich - Redmond WA, US
Adam G. Poulos - Sammamish WA, US
Omer Bilal Orhan - Bellevue WA, US
Marcus Ghaly - Kirkland WA, US
Cameron G. Brown - Bellevue WA, US
Nicholas Gervase Fajt - Seattle WA, US
Matthew Kaplan - Seattle WA, US
International Classification:
G06T 19/00
G06T 7/40
G06T 17/10
G02B 27/01
Abstract:
A system and method are disclosed for building virtual content from within a virtual environment using virtual tools to build and modify the virtual content.

Assisted Object Placement In A Three-Dimensional Visualization System

US Patent:
20160179336, Jun 23, 2016
Filed:
Jan 30, 2015
Appl. No.:
14/611005
Inventors:
Anthony Ambrus - Seattle WA, US
Marcus Ghaly - Kirkland WA, US
Adam Poulos - Sammamish WA, US
Michael Thomas - Redmond WA, US
Jon Paulovich - Redmond WA, US
International Classification:
G06F 3/0481
G06F 3/0484
G02B 27/01
G06F 3/01
Abstract:
Disclosed is a method, implemented in a visualization device, to assist a user in placing 3D objects. In certain embodiments the method includes displaying, on a display area of the visualization device, to a user, various virtual 3D objects overlaid on a real-world view of a 3D physical space. The method can further include a holding function, in which a first object, of the various virtual 3D objects, is displayed on the display area so that it appears to move through the 3D physical space in response to input from the user, which may be merely a change in the user's gaze direction. A second object is then identified as a target object for a snap function, based on the detected gaze of the user, the snap function being an operation that causes the first object to move to a location on a surface of the target object.
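A minimal sketch of the hold-and-snap behavior described above, simplifying objects to axis-aligned boxes. The names and geometry are invented for illustration; the patent does not prescribe this representation.

```python
# Hypothetical sketch: identify a snap target from the user's gaze point,
# then snap the held object onto a location on the target's surface.

def gaze_target(gaze_point, objects):
    # Identify which object's bounding box the gaze point falls inside.
    for name, (lo, hi) in objects.items():
        if all(l <= g <= h for g, l, h in zip(gaze_point, lo, hi)):
            return name
    return None


def snap_to_surface(held_pos, target_box):
    # Clamp the held object's x/z onto the target's footprint and rest it
    # on the target's top surface (assumed snap location).
    (lox, loy, loz), (hix, hiy, hiz) = target_box
    x = min(max(held_pos[0], lox), hix)
    z = min(max(held_pos[2], loz), hiz)
    return (x, hiy, z)


scene = {"table": ((0.0, 0.0, 0.0), (2.0, 1.0, 2.0))}
target = gaze_target((1.0, 0.5, 1.0), scene)
print(snap_to_surface((3.0, 2.0, 1.0), scene[target]))  # -> (2.0, 1.0, 1.0)
```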