Archive for the ‘User Experience’ Category

There are times when you want to prevent copying

I found when developing a painting application that I did not want to allow text copying on a canvas element. On the iPad and iPhone this can be done with CSS.

.no-copy {
  -webkit-user-select: none;
}
Now simply add this class to the elements on which you do not want the copy dialog to appear.

  <canvas id="touchpaint" class="no-copy"></canvas>
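The same styling can also be applied from script when elements are created dynamically. This is a minimal sketch; disableCopy is my own helper name, and -webkit-touch-callout is a related WebKit property that suppresses the iOS callout / copy bubble:

```javascript
// Sketch: apply the no-copy styling from JavaScript.
// disableCopy is a hypothetical helper name; the two properties are
// the WebKit CSS properties set via the element's style object.
function disableCopy( el ) {
    el.style.webkitUserSelect = 'none';   // -webkit-user-select: none
    el.style.webkitTouchCallout = 'none'; // suppress the iOS callout bubble
    return el;
}
```

For example: `disableCopy( document.getElementById('touchpaint') );`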

Read Full Post »

I had the chance to get my hands on a Nexus One and was able to run through my touch and gesture tests at http://gregmurray.org/ipad. Given that the Nexus One and the iPhone / iPad all use a variant of Webkit, I found the following:

  • ontouchstart, ontouchmove, ontouchend work fine.
  • ongesturestart, ongesturechange, ongestureend do not work.

What does this mean?

You can create some UIs that use patterns such as page swipe / flick gestures, as seen in my carousel widget, but other gestures like stack spread and pinch to zoom will not work. See the gesture page if you want to try it yourself. While limited, there is still the chance to develop HTML interfaces that work across a wide array of touch devices.
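A quick way to check which of these events a given browser exposes is simple feature detection. A minimal sketch (the helper names are my own):

```javascript
// Sketch: detect touch and gesture event support by checking for the
// corresponding handler properties on the window object.
function supportsTouch( win ) {
    return 'ontouchstart' in win;
}
function supportsGestures( win ) {
    return 'ongesturestart' in win;
}
// On Mobile Safari both return true; on the Nexus One browser only
// supportsTouch does, matching the findings above.
```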

Hope for Standardized Touch Support

Let’s hope going forward that the Nexus One (Android) browser will expand its support for gestures. For now we will have to work with touch events, which are the basis for the gestures. Let’s keep our fingers crossed that other touch-based operating system providers expose gestures and touch events in their browsers. An even better solution would be to have gesture and touch events in a future HTML standard.

Read Full Post »

Bill Scott directed me to a catalog of user interactions that the user experience community is creating to track new patterns seen with the iPad. To that end, I have begun working on some widgets.

Following are two patterns from the catalog as a Stack widget implemented using JavaScript and CSS. The JavaScript event handlers are Safari extensions to JavaScript. I also used some Webkit CSS transforms for rotating the image and creating a light box.

Tap to Open

Pretty self explanatory. This is achieved by adding an onclick event to the top-level image and expanding the widget with a user tap. I added a window surrounding the expanded images with a close button styled using Webkit gradients. The images are also randomly rotated plus or minus 15 degrees to make it look like a stack of images. The initial rotation and location is saved so when the stack is closed it can return to its initial state.

This pattern can be seen in any Webkit based browser or the iPad simulator.
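The random rotation can be sketched with a small helper and a Webkit transform. The ±15 degree range follows the description above; the function names are my own:

```javascript
// Sketch: rotate an image plus or minus up to maxDeg degrees and
// remember the original transform so the stack can be restored on close.
function randomRotation( maxDeg ) {
    return ( Math.random() * 2 - 1 ) * maxDeg; // value in [-maxDeg, maxDeg]
}

function stackImage( img ) {
    img.initialTransform = img.style.webkitTransform; // saved for restoring on close
    img.style.webkitTransform = 'rotate(' + randomRotation( 15 ) + 'deg)';
}
```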

Stack Spread

Basically a user can sneak a peek at what is inside a stack using a spread gesture and close the stack using a pinch gesture.

This gesture was the most difficult to integrate: it required tracking touch events at both the widget and the page level. When a spread or pinch gesture occurs the widget scales accordingly. I wanted to know the location of the touch events and all of the changes as they happened.

You will need an iPhone or an iPad (not the simulator) to see this in action. I created some internal hooks into the widget to allow for manual scaling so I could develop the widget using Safari 4.x.
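Such a manual-scaling hook can be sketched as a plain function. This is an assumption about how the hook might look (setManualScale is a hypothetical name, not the widget's actual API); exposing scale this way lets the widget be driven from desktop Safari where no pinch / spread gestures are available:

```javascript
// Sketch of a manual-scaling hook (hypothetical name setManualScale):
// applies a Webkit scale transform directly so the widget can be
// exercised without real gesture events.
function setManualScale( widget, scale ) {
    widget.style.webkitTransform = 'scale(' + scale + ')';
    return scale;
}
```

For example: `setManualScale( document.getElementById('stack'), 1.5 );`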

Tracking Gestures and Touch Events in web pages

A gesture such as a spread or pinch fires the following events:

  1. touchstart (first finger)
  2. gesturestart
  3. touchstart (second finger)
  4. gesturechange (many times)
  5. gestureend (when the second finger is removed)
  6. touchend (second finger)
  7. touchend (first finger)

Confused yet? I was, which led me to create a gesture object following the second touchstart event.
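The ordering above can be made concrete by replaying the event names and counting how many fingers are down as each event fires. A small sketch (fingersDuring is my own name):

```javascript
// Sketch: replay the event sequence above and track the finger count.
// Note that gesturestart fires after the first touchstart but before
// the second, and gestureend fires before either touchend.
function fingersDuring( events ) {
    var down = 0;
    return events.map( function ( name ) {
        if ( name === 'touchstart' ) { down += 1; }
        if ( name === 'touchend' )   { down -= 1; }
        return name + ':' + down;
    });
}
```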

I created a wrapper for the touch events that I could use to track the life of the gesture. The object, once created, looks like this:

window.currentGesture = {
   touch1 : { id : 7, y : 325 },
   touch2 : { id : 4, y : 200 },
   top    : { /* ... */ },  // reference to touch2
   bottom : { /* ... */ },  // reference to touch1
   scale  : 125             // distance between top and bottom
};

This object is built with this function:

    function touchStart( e ) {
        if ( window.inTouch !== true ) {
            // first finger down: begin tracking the gesture
            window.inTouch = true;
            window.currentGesture = {
                touch1 : { id : e.changedTouches.item(0).identifier, y : e.changedTouches.item(0).pageY }
            };
        } else {
            // second finger down: complete the gesture object
            var wc = window.currentGesture;
            wc.touch2 = { id : e.changedTouches.item(0).identifier, y : e.changedTouches.item(0).pageY };
            wc.scale = Math.abs( wc.touch2.y - wc.touch1.y );
            if ( wc.touch2.y > wc.touch1.y ) {
                wc.top = wc.touch1;
                wc.bottom = wc.touch2;
            } else {
                wc.top = wc.touch2;
                wc.bottom = wc.touch1;
            }
        }
    }
This code gets called for each finger touch and will only complete the gesture object when a gesture is already in progress ( window.inTouch === true ).

The next difficulty was tracking the changes in the pageY values. At first I thought I could do this by tracking the identifiers for the events, but there were cases when they were the same for both touches, so I went with an easier approach: if the value is less than the top it becomes the top, and vice versa with the bottom. Next I needed to track the changes as they happened. I thought the gesturechange event would be the event for me, but it did not contain references to the touches, so I instead used the touchmove events, which are what the gesturechange events are built on.

    function touchMove( e ) {
        if ( window.inTouch !== true ) {
            return; // no gesture in progress
        }
        var wc = window.currentGesture;
        var t1 = e.changedTouches.item(0);
        // a touch below the current bottom becomes the new bottom,
        // a touch above the current top becomes the new top
        if ( t1.pageY > wc.bottom.y ) {
            wc.bottom.y = t1.pageY;
        } else if ( t1.pageY < wc.top.y ) {
            wc.top.y = t1.pageY;
        }
        wc.scale = wc.bottom.y - wc.top.y;
    }

The code above fires many times for each finger as the gesture occurs. At first I tried to access e.touches.item(0) but found the touches were from the original touch event while I wanted the current one. The problem was solved by getting the changed touches with e.changedTouches.item(0).

One thing I noticed was that while the first touch may have been in the widget, sometimes the second might fall slightly outside, and once the touchmove events occurred outside the widget they would not get detected. I was able to get around this by attaching the touchMove event listener to both the document and the top image.
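The workaround can be sketched as a small helper that registers the same handler on several targets (listenOnAll is my own name for it):

```javascript
// Sketch: attach one handler to several targets so touchmove events
// are still seen when a finger drifts outside the widget.
function listenOnAll( targets, type, handler ) {
    for ( var i = 0; i < targets.length; i++ ) {
        targets[i].addEventListener( type, handler, false );
    }
}
```

For example: `listenOnAll( [ document, document.getElementById('stack') ], 'touchmove', touchMove );`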

So what did we learn?

  • You can detect touch events and gestures in a web page.
  • Touch events are the basis for gestures.
  • Touch events can be used to find additional information about a gesture, such as its location.
  • In some cases an additional level of abstraction might be needed to track a gesture.

Find the Stack widget source and a working example here.

Let me know if you have any questions or comments.

Additional Reading:

iPad Interesting Moments (Bill Scott)
New Multi-touch Interactions (Luke Wroblewski)
Touching and Gesturing on the iPhone (Neil Roberts)

Read Full Post »