PPar Seminar Series 2019/20

PPar Seminar Series schedule for 2019/20

IF = Informatics Forum, Central campus

AT = Appleton Tower, Central campus

BC = Bayes Centre, Central campus

Each entry below lists the speaker, title and abstract, followed by the date, time, venue and organiser.

Dr. Arjuna Sathiaseelan (GAIUS Networks Inc.)

GAIUS: Enabling hyperlocal content for the next three billion.

Global mobile broadband penetration is expected to reach 60% by 2020. However, the next three billion users still struggle to access content that is relevant to them. The underlying market pain points are threefold:

1. There are no good platforms for enabling localised content generation and distribution, especially in emerging markets.
2. Current mobile web content is poorly designed and not optimized for fast content delivery in emerging markets.
3. Localised content requires sustainable content platforms where content can easily be monetised through ads, which is difficult to achieve while Google and Facebook control the majority of the content ad ecosystem.

I will present GAIUS, a platform with an ambitious vision of solving the last-mile content provisioning challenge for the next three billion users in emerging markets. GAIUS aims to bring together consumers, content producers and ad providers into a consumer-focused local cloud ecosystem, delivering unprecedented value to these stakeholders in terms of customer reach, revenue and performance.

Date: 10th Oct 2019 | Time: 16:00-18:00 | Venue: IF-G.03 | Organiser: ICSA
Aaron Quigley

Knowing the World: "Object recognition in HCI with Radar, Vision and Touch"

The exploration of novel sensing to facilitate new interaction modalities is an active research topic in Human-Computer Interaction. Across the breadth of HCI we can see the development of new forms of interaction underpinned by the appropriation or adaptation of sensing techniques based on the measurement of sound, light, electric fields, radio waves, biosignals etc. In this talk I will delve into three forms of sensing for object detection and interaction with radar, blurred images and touch. 

RadarCat (UIST 2016, Interactions 2018, IMWUT 2018, UbiComp 2019) is a small, versatile system for material and object classification which enables new forms of everyday proximate interaction with digital devices. RadarCat exploits the raw radar signals, which are unique when different materials and objects are placed on the sensor; using machine learning techniques, these objects can be accurately recognized. An object's thickness, its state (filled or empty mug) and different body parts can also be recognized. This gives rise to research and applications in context-aware computing, tangible interaction (with tokens and objects), industrial automation (e.g., recycling) and laboratory process control (e.g., traceability). AquaCat (MobileHCI 2017 workshop), meanwhile, is a low-cost radar-based system capable of discriminating between a range of liquids and powders. Further, in Solinteraction we explore two research questions with radar as a platform for sensing tangible interaction: counting, ordering and identifying objects, and tracking the orientation, movement and distance of these objects. We detail the design space and practical use cases for such interaction, which allows us to identify a series of design patterns that go beyond static interaction to continuous and dynamic interaction with radar.
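
The abstract does not spell out the RadarCat pipeline, but the core idea (a per-object radar feature vector fed to a supervised classifier) can be pictured roughly as in the sketch below. This is purely illustrative: the feature dimensionality, the synthetic data and the random-forest classifier are assumptions for exposition, not the published method.

    # Illustrative sketch only: a toy material classifier in the spirit of
    # RadarCat, trained on synthetic "radar" feature vectors rather than
    # real radar chirp data. Feature size and classifier are assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    materials = ["wood", "glass", "steel", "empty_mug", "full_mug"]

    # Pretend each placement of an object on the sensor yields a
    # 64-dimensional feature vector derived from the raw radar signal.
    X = np.vstack([rng.normal(loc=i, scale=0.5, size=(200, 64))
                   for i, _ in enumerate(materials)])
    y = np.repeat(materials, 200)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")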

Beyond radar, SpeCam (MobileHCI '17) is a lightweight surface color and material sensing approach for mobile devices which uses only the front-facing camera and the display as a multi-spectral light source. We leverage the natural use of mobile devices (placing them face-down) to detect the material underneath and therefore infer the location or placement of the device. SpeCam can then be used to support "discreet computing" with micro-interactions, avoiding the numerous distractions that users face daily with today's mobile devices. Our two-part study shows that SpeCam can i) recognize colors in the HSB space that are 10 degrees apart near the three dominant colors and 4 degrees apart otherwise, and ii) recognize 30 types of surface materials with 99% accuracy. These findings are further supported by a spectroscopy study. Finally, we suggest a series of applications based on simple mobile micro-interactions suitable for using the phone when placed face-down with blurred images.
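
As a rough illustration of the SpeCam sensing idea (flash the display with a few known colours, read the front camera's mean RGB response under each, and match the resulting signature against stored references), here is a minimal nearest-neighbour sketch. The camera capture is simulated with random stand-in signatures; the colour set, noise model and matching rule are assumptions, not the published implementation.

    # Illustrative sketch only: SpeCam-style surface identification, with
    # the display acting as a multi-spectral light source. Real captures
    # are replaced here by random stand-in signatures.
    import numpy as np

    rng = np.random.default_rng(1)
    display_colours = ["red", "green", "blue", "white"]   # flashed full-screen
    materials = ["desk_wood", "paper", "fabric"]

    # Reference signature per material: mean camera RGB response under each
    # display colour, i.e. a len(display_colours) x 3 matrix.
    references = {m: rng.random((len(display_colours), 3)) for m in materials}

    def capture_response(true_signature, noise=0.02):
        """Simulated front-camera reading for one face-down placement."""
        return true_signature + rng.normal(scale=noise, size=true_signature.shape)

    def classify(reading):
        """Nearest-neighbour match against the stored reference signatures."""
        return min(references, key=lambda m: np.linalg.norm(reading - references[m]))

    print(classify(capture_response(references["paper"])))   # expected: paper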

Finally, with touch we show a sensing technique for detecting finger movements on the nose, using EOG sensors embedded in the frame of a pair of eyeglasses (ISWC 2017). Eyeglass wearers can use their fingers to exert different types of movement on the nose, such as flicking, pushing or rubbing. These subtle gestures, in the spirit of "discreet computing", can be used to control a wearable computer without calling attention to the user in public. We present two user studies in which we test recognition accuracy for these movements. I will conclude this talk with some speculations on how touch, radar and vision processing might be used to realise "discreet" and "blended reality" interactions in AR and beyond.
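
The gesture-recognition step can likewise be pictured as windowed signal classification: cut the EOG stream into short windows, compute a few per-channel statistics, and train a classifier on labelled examples of each gesture. The sketch below uses synthetic two-channel signals and a linear SVM; the sampling rate, window length, features and classifier are all assumptions, not the ISWC 2017 setup.

    # Illustrative sketch only: recognising nose gestures (flick/push/rub)
    # from short, synthetic two-channel "EOG" windows.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)
    gestures = ["flick", "push", "rub"]

    def fake_window(gesture, n_samples=250, n_channels=2):
        """One 1-second window of 2-channel EOG at 250 Hz (synthetic)."""
        base = {"flick": 1.0, "push": 2.0, "rub": 3.0}[gesture]
        wave = base * np.sin(np.linspace(0, base * np.pi, n_samples))[:, None]
        return wave + rng.normal(scale=0.3, size=(n_samples, n_channels))

    def features(window):
        # Per-channel mean, standard deviation and peak-to-peak amplitude.
        return np.concatenate([window.mean(0), window.std(0),
                               np.ptp(window, axis=0)])

    X = np.array([features(fake_window(g)) for g in gestures for _ in range(100)])
    y = np.array([g for g in gestures for _ in range(100)])
    clf = SVC(kernel="linear").fit(X, y)
    print(clf.predict([features(fake_window("rub"))]))   # expected: ['rub']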

Date: 2nd Oct 2019 | Time: 14:00 | Venue: G.03, Inspace | Organiser: ICSA
Mahesh Marina

Towards an Open Mobile Network Ecosystem

The make-up of mobile network systems has traditionally been decided within a fairly closed ecosystem of equipment vendors and operators, and deployed systems to date are largely a composition of proprietary black-box appliances. But as we head towards 5G, mobile networks are undergoing a transformation driven by several factors. In this talk, I'll give my perspective on what is causing this transformation and highlight our key research contributions to this end, in particular our recent work focusing on two important use cases: indoor and universal mobile access. I'll also briefly outline other strands of my research from the recent past.

Date: 26th Sept 2019 | Time: 16:00-17:00 | Venue: IF-G.03 | Organiser: ICSA