The electronic devices described herein are configured to enhance user experience associated with using a pen on the touch screens of the electronic devices. Contact of a pen on the touch screen is detected by a persistent, pen-aware shell that occupies some or all of the touch screen user interface throughout operation of the electronic device. Detected pen contact with the touch screen is captured/collected as pen input and used to perform a function of the electronic device or shared with an application such that the application may perform a function based on the pen input. A performed function or application with which to share the pen input is selected by a user, automatically by an operating system, or a combination of the two. Automatic capture of pen input by the persistent, pen-aware shell provides an intuitive way of making use of pen/touch screen capability of an electronic device.
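The capture-and-route behavior described above can be sketched as follows. This is a minimal illustration only; the class and handler names (`PenAwareShell`, `on_pen_contact`, `register`) are hypothetical and not taken from the application.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class PenStroke:
    """A captured pen contact: a list of (x, y) touch-screen points."""
    points: list


class PenAwareShell:
    """Hypothetical persistent shell that captures pen input and routes it
    to a function or application selected by the user or the OS."""

    def __init__(self) -> None:
        self.handlers: dict[str, Callable[[PenStroke], str]] = {}
        self.selected: Optional[str] = None

    def register(self, name: str, handler: Callable[[PenStroke], str]) -> None:
        # An application registers to receive shared pen input.
        self.handlers[name] = handler

    def select_target(self, name: str) -> None:
        # Selection may come from the user, the OS, or a combination.
        self.selected = name

    def on_pen_contact(self, stroke: PenStroke) -> str:
        # Captured input is shared with the currently selected target.
        return self.handlers[self.selected](stroke)


shell = PenAwareShell()
shell.register("notes", lambda s: f"saved {len(s.points)} points")
shell.select_target("notes")
result = shell.on_pen_contact(PenStroke(points=[(0, 0), (1, 1)]))
print(result)  # saved 2 points
```

The key design point is that the shell persists across applications, so pen contact is captured even when no pen-aware application has focus.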
Context-Based Shape Extraction And Interpretation From Hand-Drawn Ink Input
- Redmond WA, US Sophie A. Beland - Seattle WA, US Oluwadara Oke - Seattle WA, US Kevin J. Jeyakumar - Bellevue WA, US
International Classification:
G06F 17/24 G06F 3/0488 H04W 4/02 G06N 99/00
Abstract:
The electronic devices described herein are configured to enhance user experience associated with drawing or otherwise inputting shape data into the electronic devices. Shape input data is identified and matched against known shape patterns and, when a match is found, an entity associated with the shape is determined. The entity is converted into an annotation for rendering and/or displaying to the user. The shape identification, entity determination, and annotation conversion may all be based on one or more context elements to increase the accuracy of the shape interpretation. In particular, elements of conversations held via the electronic devices may be used as context for the shape interpretation. Further, machine learning techniques may be applied based on a variety of feedback data to improve the accuracy, speed, and/or performance of the shape interpretation process.
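The match-then-disambiguate flow described in the abstract can be sketched as below. This is an assumption-laden illustration: the shape catalog, the `interpret` function, and the use of a bag of conversation words as context are all hypothetical simplifications of the described process.

```python
from typing import Optional

# Hypothetical catalog mapping a recognized shape pattern to the
# candidate entities it may represent.
KNOWN_SHAPES: dict[str, set] = {
    "circle": {"clock", "wheel"},
    "arrow": {"direction", "next-step"},
}


def interpret(shape: str, context_words: set) -> Optional[str]:
    """Match shape input against known patterns, then pick an entity,
    using recent conversation words as context to break ties."""
    entities = KNOWN_SHAPES.get(shape)
    if not entities:
        return None  # no known pattern matched
    # Prefer an entity that appears in the conversation context.
    for entity in entities:
        if entity in context_words:
            return entity
    # Fall back deterministically when context gives no signal.
    return sorted(entities)[0]


print(interpret("arrow", {"what", "is", "the", "next-step"}))
print(interpret("circle", set()))
```

A feedback loop of the kind the abstract mentions would adjust the catalog or the context weighting over time; that learning step is omitted here.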
- Redmond WA, US Sophie A. Beland - Seattle WA, US Kevin J. Jeyakumar - Bellevue WA, US Oluwadara Oke - Seattle WA, US
International Classification:
G06F 3/0354 G06F 3/038 G06F 3/0484
Abstract:
The electronic devices described herein are configured to enhance user experience associated with using a pen on the touchscreens of the electronic devices. Proximity and/or approach of a pen is detected and, when the pen is close enough, the operating system triggers a pen event, which is communicated to applications running on the electronic devices. The applications may handle the pen event by redrawing a user interface to accommodate the incoming pen input in a smooth, seamless way. Further, pen-compatible interface controls may be provided by the operating system to enhance pen use with applications that may not have the necessary functionality.
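The proximity-triggered pen event could be sketched as follows; the threshold value, the `os_poll` dispatcher, and the `on_pen_event` callback name are hypothetical stand-ins for whatever the operating system actually provides.

```python
# Assumed hover-detection threshold; the real value is hardware-dependent.
HOVER_THRESHOLD_MM = 10.0


class App:
    """Hypothetical application that redraws its UI when the OS
    signals that a pen is approaching the screen."""

    def __init__(self) -> None:
        self.pen_mode = False

    def on_pen_event(self) -> None:
        # Redraw the UI to accommodate incoming pen input.
        self.pen_mode = True


def os_poll(distance_mm: float, apps: list) -> None:
    """OS-side sketch: when the pen is close enough, raise a pen
    event and communicate it to running applications."""
    if distance_mm <= HOVER_THRESHOLD_MM:
        for app in apps:
            app.on_pen_event()


app = App()
os_poll(25.0, [app])
print(app.pen_mode)  # False: pen still too far away
os_poll(8.0, [app])
print(app.pen_mode)  # True: pen event delivered, UI redrawn
```

Applications without this handler would instead receive the OS-provided pen-compatible interface controls the abstract mentions.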
Capturing Handwriting By A Cartridge Coupled To A Writing Implement
- Redmond WA, US Sophie A. Beland - Seattle WA, US Oluwadara Oke - Seattle WA, US Kevin J. Jeyakumar - Bellevue WA, US
International Classification:
G06K 9/22 G06F 3/0354 G06F 3/038 G06K 9/00
Abstract:
The electronic devices described herein are configured to enhance user experience associated with using a pen or other writing implement and capturing the content written or drawn. A cartridge device is coupled to or included in the pen or writing implement. The cartridge device is configured to collect pen input based on the pen being used by a user to write or draw. Collected pen input is saved on the cartridge device and, when a connection to a network storage device or cloud server is detected, the collected pen input is uploaded. The cartridge device may be associated with a user account based on user credentials, such that pen input stored and/or uploaded is associated with the user account on the cartridge device and on the network storage device or cloud server.
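The buffer-then-upload behavior of the cartridge can be sketched as below. The `Cartridge` class, its in-memory `cloud` list standing in for a network storage device, and the account string are all hypothetical; a real cartridge would persist strokes to local storage and authenticate before uploading.

```python
class Cartridge:
    """Hypothetical cartridge coupled to a writing implement: collects
    strokes locally, then uploads them, tagged with the user account,
    once network connectivity is detected."""

    def __init__(self, account: str) -> None:
        self.account = account          # associated via user credentials
        self.buffer: list = []          # strokes saved on the cartridge
        self.cloud: list = []           # stand-in for the network store

    def collect(self, stroke: str) -> None:
        # Pen input collected while the user writes or draws.
        self.buffer.append(stroke)

    def on_network_available(self) -> None:
        # Upload buffered input, associated with the user account on
        # both the cartridge and the network storage device.
        self.cloud.extend((self.account, s) for s in self.buffer)
        self.buffer.clear()


cart = Cartridge(account="user@example.com")
cart.collect("stroke-1")
cart.collect("stroke-2")
cart.on_network_available()
print(len(cart.cloud), len(cart.buffer))  # 2 0
```

Clearing the local buffer only after a successful upload is what lets the cartridge tolerate long offline periods without losing captured input.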