iPhone iOS 6 Touch Screen Multitouch

11/23/2023

G. Samuel Hurst began an after-hours investigation after returning to the Oak Ridge National Laboratory in 1970. He discovered that a conductive cover sheet was just the thing the screen needed, a breakthrough that paved the way for what we now know as resistive touch technology, which he and his team called Elographics. The group went on to patent the first curved glass touch interface. Nimish Mehta created the first human-controlled multitouch device at the University of Toronto in 1982, and Myron Krueger, an American computer artist who built an optical system that could capture hand gestures, pioneered gesture interaction soon after. Touchscreens became extensively commercialized in the early 1980s.

In order to fully appreciate the mechanisms for handling touch screen events within an iOS 6 iPhone application, it is first important to understand both the responder chain and the methods that are called on a responder depending on the type of interaction.

In the chapter entitled Understanding iPhone iOS 6 Views, Windows and the View Hierarchy we spent some time talking about the view hierarchy of an application's user interface and how that hierarchy also defines part of the application's responder chain. In order to fully understand the concepts behind the handling of touch screen gestures it is first necessary to spend a little more time learning about that chain.

When the user interacts with the touch screen of an iPhone, the hardware detects the physical contact and notifies the operating system. The operating system creates an event associated with the interaction and places it in the currently active application's event queue, where it is picked up by the event loop and passed to the current first responder, the first responder being the object with which the user was interacting when the event was triggered (for example, a UIButton or UIView object). If the first responder has been programmed to handle the type of event received, it does so; a button, for instance, may have an action defined to call a particular method when it receives a touch event. Having handled the event, the responder then has the option of discarding it or passing it up to the next responder in the responder chain (defined by the object's nextResponder property) for further processing. If the first responder is not able to handle the event, it likewise passes it to the next responder, and so on, until the event either reaches a responder that handles it or reaches the end of the chain (the UIApplication object), where it is either handled or discarded.
Take, for example, a UIView with a UIButton subview. If the user touches the screen over the button then the button, as first responder, receives the event. If the button is unable to handle the event, it needs to be passed up to the view object; if the view is also unable to handle it, the event is passed on to the view controller, and so on. When working with the responder chain, it is important to note that the passing of an event from one responder to the next does not happen automatically: if an event needs to be passed to the next responder, code must be written to make it happen, as in the sketch below.
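The following is a minimal Objective-C sketch of that forwarding step. The HighlightView class name is hypothetical; nextResponder and the touchesBegan:withEvent: method are the standard UIResponder API.

```objc
#import <UIKit/UIKit.h>

// Hypothetical view that reacts to a touch and then explicitly
// hands the event to the next responder in the chain.
@interface HighlightView : UIView
@end

@implementation HighlightView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Handle the event locally (here, a simple visual change).
    self.backgroundColor = [UIColor yellowColor];

    // Forwarding is not automatic: without this call the event
    // stops here and never reaches the superview or controller.
    [self.nextResponder touchesBegan:touches withEvent:event];
}

@end
```

Without that final line the superview and view controller never see the touch; with it, each responder in turn gets the same opportunity to act on the event.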
The touchesBegan method is called when the user first touches the screen. Passed to this method are an argument called touches, of type NSSet, and the corresponding UIEvent object. The touches set contains a UITouch object for each finger in contact with the screen. The tapCount property of any of the UITouch objects in the set can be read to identify the number of taps, if any, performed by the user, and the coordinates of an individual touch can be obtained from the UITouch object either relative to the entire screen or within the local view itself.
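As a sketch of reading that information, assuming a made-up TapInfoView class (tapCount and locationInView: are the real UITouch APIs; passing nil to locationInView: yields coordinates in the window's coordinate system):

```objc
#import <UIKit/UIKit.h>

// Hypothetical view that logs the details of each new touch.
@interface TapInfoView : UIView
@end

@implementation TapInfoView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];

    // Coordinates relative to this view and to the enclosing window.
    CGPoint localPoint  = [touch locationInView:self];
    CGPoint windowPoint = [touch locationInView:nil];

    NSLog(@"%lu finger(s), %lu tap(s) at %@ (window: %@)",
          (unsigned long)[touches count],
          (unsigned long)touch.tapCount,
          NSStringFromCGPoint(localPoint),
          NSStringFromCGPoint(windowPoint));
}

@end
```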
The touchesMoved method is called when one or more fingers move across the screen. As fingers move, this method is called repeatedly, allowing the application to track the new coordinates and touch count at regular intervals. As with the touchesBegan method, it is provided with an event object and an NSSet object containing UITouch objects for each finger on the screen.

The touchesEnded method is called when the user lifts one or more fingers from the screen. As with the previous methods, touchesEnded is provided with the event and NSSet objects.

Finally, when a gesture is interrupted by a high level interrupt, such as the phone detecting an incoming call, the touchesCancelled method is called. A sketch covering these three methods follows.
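Again a hedged sketch with an invented class name (DragTrackingView); the three method signatures match the iOS 6 UIResponder API:

```objc
#import <UIKit/UIKit.h>

// Hypothetical view that follows a touch as it moves, ends,
// or is cancelled by the system.
@interface DragTrackingView : UIView
@end

@implementation DragTrackingView

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Called repeatedly while fingers move; read the latest position.
    CGPoint point = [[touches anyObject] locationInView:self];
    NSLog(@"Moved: %lu finger(s) at %@",
          (unsigned long)[touches count], NSStringFromCGPoint(point));
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // The user lifted one or more fingers from the screen.
    NSLog(@"Ended: %lu touch(es)", (unsigned long)[touches count]);
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // The system interrupted the gesture (an incoming call, for
    // example); undo any state set up in touchesBegan here.
    NSLog(@"Gesture cancelled");
}

@end
```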
We have covered these basics in this chapter. In the next chapter, entitled An Example iOS 6 iPhone Touch, Multitouch and Tap Application, we will use these concepts to create an example application that demonstrates touch screen event handling.