Tangible Interaction

Tangible interaction is a type of user experience (UX) that links physical objects with digital information. UX designers create tangible user interfaces (TUIs) to let users interact with digital systems by touching, moving, or manipulating real-world items. This approach makes digital content more intuitive, engaging, and memorable.
Explore the exciting area of design for touch and haptics—the use of devices that stimulate the senses of motion and touch—in this video with Alan Dix, author of the bestselling book “Human-Computer Interaction” and Director of the Computational Foundry at Swansea University:
Screens may have become synonymous with human-computer interaction, but humans are naturally used to more in their world than peering at images and text or thumbing at icons and buttons on rectangular surfaces. Tangible interaction design (TID) blends the digital world with physical experience. Designers access a whole new realm of UX design when they let users manipulate data by moving, touching, or arranging real-world objects.
Still, tangible UX design isn’t a new “discipline”—humans have been using physical controls for centuries. In the 20th century, fundamental electronic devices already let users enjoy many kinds of experiences, some of them incorporating haptic feedback—game controllers, for instance, vibrated when players suffered “wounds” or setbacks. Design for tangible interaction endures in many examples, such as:
1. Computer Mice
The mouse is a classic tangible interface; it was the “natural” choice for users to move a controller physically to control a pointer on the screen. The scroll wheel and buttons offer direct tactile feedback and make digital navigation feel immediate and intuitive.
2. Game Controllers
Gamepads, joysticks, and motion-sensitive controllers—like the Nintendo Wii Remote or PlayStation Move—let users control complex digital environments through physical gestures, tilts, and pressure-sensitive buttons.
3. Touchscreens with Physical Styluses
Devices like iPads or drawing tablets that use a stylus let users “draw” or “write” digitally in a way that mimics real-world tools. The stylus becomes a tangible bridge between the user’s hand and digital content.
4. Smart Home Dials (like Nest Thermostat)
As with older, “analogue” dials, users rotate a physical dial to adjust the digital temperature setting. They receive immediate tactile and visual feedback, blending physical interaction with smart home interfaces.
5. Digital Musical Instruments (like Reactable)
These instruments allow musicians to manipulate sound through pressure, shape, or object placement, and turn abstract sound design into tactile, performative experiences.
The Reactable might resemble a gambling hall game to some, but this round music table enables multiple users to add sound modules by placing and rotating plastic items. The system updates instantly, creating a live, improvisational experience. You can bet it sounds interesting.
© Organology, Fair use
“Matter” is the keyword; it’s the solid stuff that brings the nature of interaction closer to the human experience than screen-based activities. Tangible design means designers empower users to reach beyond screens—touch interfaces are immersive, but TUIs extend this engagement into physical space. Tangible interaction transforms bits (digital data) into atoms (physical forms). When digital data takes on a physical form, users can interact with it in more natural, meaningful, and memorable ways.
This method, or style, takes inspiration from the physical world. Rather than tap virtual buttons, users might rotate a knob to zoom in on a map or move a block to trigger a change on a screen. Tangible interaction offers unique value across many user experience challenges. Designers who build tools for education, industrial training, collaboration, or accessibility can find this approach helps create systems that feel intuitive and immediate to users.
When users interact with familiar physical objects, they rely less on short-term memory and can be more “themselves” as they do what comes naturally. Instead of recalling abstract commands or navigating complex menus, they use gestures like rotating, sliding, or grouping objects to perform tasks. This reduces mental effort and aligns with how people naturally think and act. Spatial memory—how users remember locations or arrangements—helps users retain information and return to tasks quickly.
Explore the power of recognition rather than recall, along with other important heuristics that help guide designers as they fine-tune better digital products, in this video with William Hudson, User Experience Strategist and Founder of Syntagm Ltd:
Tangible interfaces support “embodied cognition”—the idea that learning happens better when the body is involved. For example, when young children learn to fit shapes through corresponding holes in educational toy frames or build words with physical tiles, they pick up and grasp important concepts (literally and figuratively). Students who explore geography with interactive maps can remember content more effectively, too. Physical actions like rotating, stacking, or placing reinforce learning as they engage multiple senses. Conversely, information in a book or on a screen limits knowledge transfer to sight and—for screens equipped with it—sound.
UX designers who work in education or training can tap the potential of tangible interaction for its ability to turn abstract ideas into concrete, hands-on experiences that can sink in faster for users.
Tangible systems tend to be fun and immersive. Users of all ages can savor a sense of novelty and joy as they interact with systems that respond to physical gestures. Because it makes the digital world feel more real, tangible interaction boosts emotional engagement. In public installations, museums, or creative software, this can turn casual users into active participants. The “take you there” element of feeling catapulted into the past makes displays far more relatable—visitors can get into the experience much further than they could if they were to observe a printed board or plaque set beside an exhibit. From dinosaurs to medieval knights, the added element of tangibility helps bring them closer while appealing to the childlike curiosity even the oldest visitors can enjoy as they interact with devices like wheels and pucks.
This smart tangible puck can move items on the display even when users lift it off the touch table, communicating wirelessly with the multitouch table—offering enormous, monumental, and maybe even mammoth potential in settings like museums.
© Ideum and SMK - Statens Museum for Kunst, Fair use
Physical objects naturally invite group interaction and bring individuals together. This “show and tell” factor can spark enthusiasm and participation in a way that’s natural and easier than, for example, when one person just refers to an abstract concept and hopes others join in. Tangible systems like the Reactable music interface or the metaDESK digital table allow multiple users to gather around, manipulate shared tokens, and contribute equally and in a natural way that offers a seamless experience. The spatial nature of these systems makes roles and actions visible to everyone; it promotes shared understanding and reduces confusion.
This physical presence supports non-verbal communication, too—users can see each other’s hands moving, anticipate actions, and adjust accordingly. UX designers who build for collaborative settings—like educational teams, healthcare providers, or co-creative workplaces—can use tangible interaction to create shared, low-friction environments and make teamwork more “human” in the process.
Tangible interfaces help users with disabilities, particularly motor, visual, or cognitive disabilities. For example, someone with limited vision can use shape, texture, and physical location to interact with content—and engage with it more profoundly than they could by hearing about it through assistive technology alone. Also, large tokens with distinct forms and tactile feedback can improve accuracy and independence.
Designers can integrate voice prompts, haptic responses, or auditory cues to complement physical feedback, too. This multimodal approach offers more entry points, improves inclusivity, and ensures that everyone can tap into more immersive experiences in more ways.
Discover why accessible design is vital and how it helps all users enjoy better, more thoughtful products from brands:
Naturally, physical interaction mirrors how people already interact with the world—for example, whenever they turn knobs, slide levers, stack blocks, or arrange objects. These gestures feel familiar and require little or no training; humans are “built” to move things. They also offer more nuance than traditional inputs—permitting a multifunctional aspect. For instance, a user can rotate an object to change speed or direction, depending on how they turn it—think of fast versus slow turns, or turning versus turning while pushing in—a convenience that’s hard to replicate on the “flatness” of touchscreens.
Tangible interaction helps designers connect real-world processes to digital functions—bridges that enable designers to build interfaces that feel context-aware and responsive. For example, when a user turns a physical knob to dim smart lights or places an object on a table to launch a digital presentation, they’re using tangible interaction to control a digital process through physical input. This bridge works in both directions:
Physical → Digital: For example, place a tool on a desk and, thanks to sensors, a screen can show maintenance instructions for that tool.
Digital → Physical: For example, a fitness tracker vibrates and flashes a light when you reach a daily step goal, turning digital data into a physical response that provides immediate, tangible feedback.
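As a rough sketch of that two-way bridge (the handler names and data structures here are hypothetical, not a real sensor API), each direction reduces to mapping an event in one realm to a response in the other:

```python
# Hypothetical sketch of the physical <-> digital bridge.
# Handler names and data structures are illustrative only.

def on_tool_placed(tool_id, instructions_db):
    """Physical -> Digital: a tagged tool placed on the desk
    looks up the maintenance instructions a screen would show."""
    return instructions_db.get(tool_id, "No instructions found.")

def on_step_count(steps, goal=10_000):
    """Digital -> Physical: when the step count reaches the goal,
    return the actuator commands a fitness tracker might fire."""
    return ["vibrate", "flash_led"] if steps >= goal else []
```

The point is the symmetry: both handlers are thin mappings, which keeps the cause-and-effect relationship legible to users.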
UX designers build successful tangible systems when they follow a few foundational principles:
Every physical action should lead to a clear digital reaction. For example, users rotate a dial to change a setting or move a block to reveal new content. Tight coupling between physical cause and digital effect helps users understand right away what their gestures mean and what they result in.
Physical layout can represent digital structure—for example, designers might allow users to line up blocks to form a sequence or group objects to cluster ideas. These spatial choices make abstract structures visible and manipulable.
This principle focuses on how people use their bodies—not just their fingers. Gestures like twisting, pushing, and leaning can indicate intent to systems that read whole-body posture or movement, so the latter can provide rich, immersive feedback. The world of XR (extended reality), including AR (augmented reality) and VR (virtual reality), taps into this, too.
Explore the exciting landscape of virtual reality in this video about how it has evolved and the virtually limitless possibilities it can offer both designers and users:
Real-world objects offer tactile clues: texture, weight, resistance. Designers can also add lights, vibrations (such as haptic feedback), or sounds to strengthen feedback—and let users confirm actions even if they’re not looking at a screen.
People expect physical objects to behave like real things, where objects should feel solid, stable, and logically constrained. For example, a steering wheel should turn in a way that mimics what drivers would find in a real car, where hard-left or hard-right turns take them to an extreme limit in either direction. A steering wheel that can turn endlessly in either direction can be confusing—or even worry users that it’s disengaged from the steering column and they can’t control the “vehicle.”
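In code, that expectation often comes down to clamping the digital value at the limits the physical object implies. A minimal sketch (the ±450° range is an assumption modeled on a typical car wheel):

```python
# Constrain a rotary control so it behaves like a real steering wheel
# rather than spinning endlessly. The limit is an assumed value.
STEERING_LIMIT_DEG = 450

def apply_rotation(current_angle, delta_deg):
    """Return the new wheel angle, clamped to the physical range."""
    return max(-STEERING_LIMIT_DEG,
               min(STEERING_LIMIT_DEG, current_angle + delta_deg))
```

Pairing a software clamp like this with a physical hard stop (or motorized resistance) keeps the digital model and the user’s hands in agreement.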
Here’s how to start building tangible interfaces from a UX design perspective:
Know your users, what they need and want to do in their contexts, and how they perform tasks and achieve goals. Observe their workflows. Do they rely on physical tools? Work in groups? Use gestures naturally? When you know what they do, why they do it, and how they expect to get things done, you can exceed their expectations and delight them.
Discover what you need to discover about your users and why the contexts they find themselves in help you determine how the best design solutions can take shape for them, as Alan Dix explains:
Choose physical actions that match user goals—such as rotate for change, stack for hierarchy, and group for selection; tap into the analogue “design patterns” that match their mental models about what to do with a dial, a glove, a baseball bat—whatever it is that (safely) helps them do what they need to do.
Learn how to match your designs to users’ mental models—how they picture things and make sense of their world—in this video with Guthrie Weinschenk, behavioral economist, COO of The Team W, Inc., host of the Human Tech podcast, and author of I Love You, Now Read This Book:
Use cardboard, magnets, or blocks to prototype, and test those prototypes with real users. Even the simplest, low-fidelity prototypes can convey aspects of functionality—for example, a cardboard dial users can turn on the side of a box. Explore how users respond before you start thinking about adding electronics or developing digital interactivity: how quickly do they understand and use the controls?
Tap into prototyping early in the design process to help get on the right track, as Alan Dix explains in this video:
Add RFID (radio-frequency identification), accelerometers, or cameras to detect object movement so users enjoy high-quality performance that mirrors real-world usage. Low-latency responses build trust. Think of a glove or claw that users can grasp, swipe, wave, punch, flick, squeeze, rotate, or karate chop with to do something digitally; now consider how quickly the effect of each of those actions must appear. A delay of even a split second can ruin the experience.
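One practical way to honor that constraint is to track a latency budget per sensor event and drop or flag anything stale. A minimal sketch—the 50 ms budget is an assumption; real targets vary by device and interaction:

```python
import time

LATENCY_BUDGET_S = 0.050  # assumed 50 ms budget; tune per device

def within_budget(event_timestamp, now=None):
    """True if a sensed gesture can still get a response that feels
    instant; stale events should be dropped or flagged, not acted on."""
    now = time.monotonic() if now is None else now
    return (now - event_timestamp) <= LATENCY_BUDGET_S
```

Logging how often events miss the budget during testing tells you early whether your sensing pipeline can sustain the "split-second" feel.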
Help users predict what physical moves will do—the best designs offer intuitive interaction for seamless experiences. Use color, shape, or placement to hint at meaning, and cue users on what to do. Consider how real-world objects “explain” what users can do with them in the form of affordances.
Understand more about how to empower users with affordances, design elements that match how they understand how to use objects, and open doors to enjoyable experiences faster:
Confirm actions with lights, sound, motion, or haptics. When users notice system responses right away, they can build confidence in your design and avoid errors. For example, if they arrange blocks in a row, a green light, single pulse, and “positive-sounding” chime might tell them they’ve successfully interacted with the design; meanwhile, a red light, double pulse, and “negative-sounding” buzzer might signal they need to try again.
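The success and failure signals described above can live in one small mapping, so every modality stays consistent. A sketch—the "blocks in ascending order" success rule is a toy stand-in for whatever your system actually checks:

```python
# Map an interaction result to consistent multimodal feedback.
# The success rule here (blocks in ascending order) is illustrative.

def feedback_for(block_positions):
    """Return the light, pulse, and sound cues for an arrangement."""
    success = block_positions == sorted(block_positions)
    if success:
        return {"light": "green", "pulses": 1, "sound": "chime"}
    return {"light": "red", "pulses": 2, "sound": "buzzer"}
```

Centralizing the cues this way prevents mismatches such as a green light paired with an error buzzer.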
Try systems in real settings with real users—and ensure they can use design solutions comfortably and safely. Plan for wear, storage, and environmental conditions. For example, a controller ball they can squeeze must be durable enough for the sensors to stay responsive after multiple hard squeezes, yet sensitive enough to be able to respond to light touches. As you determine which design features and elements work best, it’s helpful to have two or more versions to try out with users, so you can pinpoint if—for example—a sphere or disk works better for them in their context of use.
Sensors and actuators require coding, calibration, and maintenance—especially if they’re in physical objects that are exposed to repeated “trauma” such as hard taps, bangs, or squeezes. Start with simple setups and scale slowly.
Tangible systems need physical space and real-world care and maintenance. Consider storage, portability, and cleaning.
Too many objects or unclear gestures can overwhelm users, so prioritize simplicity and reduce clutter.
Consider users with different abilities—and remember that users may have more than one disability. Use large tokens, strong contrast, and multimodal feedback to help them get what’s going on.
Fitted with sensors, a smart yoga mat such as this one from YogiFi helps users enjoy many benefits, including gentle guidance through instructions, posture suggestions in real-time, and much more, beautifully and intuitively bringing digital responsiveness into a healthy mindfulness practice.
© Wellnesys Inc., Fair use
Tangible interaction isn’t a novelty—it’s a powerful method to design more natural, intuitive, and inclusive experiences. As such, it’s a forward-looking return to how people already interact with the world and extends digital systems into physical space. As tangible interaction continues to evolve, new tech and materials will enter the everyday vocabulary and experiences of users, particularly in these areas:
Internet of tangible things: Everyday items with embedded sensors become UX input points.
Radical atoms: Materials that change shape or stiffness let designers create morphing interfaces.
Mixed reality: AR overlays give physical tokens digital layers, blending touch with vision.
Ultrahaptics: Mid-air feedback adds tactile responses without physical contact.
Smart environments: Responsive furniture and walls create spatially aware UX canvases.
Overall, the best tangible interactive designs focus on real user needs and leverage sensing and feedback to align well with human behavior. They’re more useful and delightful to users because they’re relatable to human beings at a level that’s more immediate and human than words might be able to describe.
Well-designed products that users can reach for can help the brands that create them achieve a powerful balance between exciting and intuitive, novel and familiar, and ingenious and simple. The term “human-centered design” can take on a higher meaning when tangible UX design lifts experiences up and away from screens. As time ticks on and the “staple” of screen-based design morphs into new areas that involve more possibilities and exciting potential, designers can find powerful reservoirs of inspiration in the timeless human need to reach out and touch.
Welcome yourself to the future of design by discovering powerful insights that can lead to exciting new products; help spark ideas from our course Creativity: Methods to Design Better Products and Services.
Get in touch with tangible interaction at a deep level in our Glossary of Human Interaction entry for tangible interaction.
Explore more insights in the UX Collective article What are tangible user interfaces (TUIs), and how do they enhance UX?
Discover delightful and exciting areas of tangible interaction in the BPI piece Tangible User Interfaces.
Find fascinating information in this IxDA piece, Teaching Tangible Interaction: Beyond the Kit of Parts.
Tangible interaction differs from traditional graphical user interfaces (GUIs) by moving digital interaction off the screen and into the physical world of objects and touch. Instead of tapping buttons on a screen, users interact with real objects—turning, sliding, or touching them to control digital functions. It blends the physical and digital realms seamlessly.
GUIs rely on visual cues and 2D space, while tangible interfaces engage the senses—touch, movement, and even spatial awareness. This can make experiences more intuitive, memorable, and inclusive, especially for children or users with low digital literacy.
Tip: Use tangible interaction when physical context matters—like in learning tools, museum exhibits, or smart home environments—where screen-based UX falls short.
Enjoy exploring exciting design dimensions further in our article No-UI: How to Build Transparent Interaction.
Devices and systems that use tangible interaction include smart toys, interactive museum exhibits, medical training tools, and smart home controls. They’re systems that let users manipulate physical objects to trigger digital responses—such as rotating a dial to control lighting or placing a block to trigger an action on a screen.
Tangible interaction shines in environments where physical engagement improves learning, safety, or intuitiveness and helps reduce cognitive load. For example, children can learn faster with hands-on tools, and surgeons train better using physical models that simulate real-world procedures.
Tip: Look for opportunities where physical context, muscle memory, or spatial awareness can enhance the UX—especially in education, healthcare, or public installations.
Discover how to help users enjoy winning digital experiences that reduce their cognitive load.
Tangible interfaces offer powerful benefits in design by making interactions more intuitive, memorable, and multisensory. They let users manipulate physical objects—like knobs, tiles, or sliders—to control digital systems. This hands-on approach boosts engagement, especially in terms of learning, collaboration, and accessible design.
Tangible UX bridges the gap between thinking and doing. It taps into muscle memory and spatial reasoning, which helps people understand and remember complex concepts. For kids, older adults, or users with limited digital literacy, this can dramatically reduce friction.
Tip: Use tangible interfaces when physical exploration or real-world feedback enhances the experience—such as in education, gaming, or therapeutic tools.
Explore an additional realm of how to design to include more users, in our Master Class How to Design for Neurodiversity: Inclusive Content and UX with Katrin Suetterlin, UX Content Strategist, Architect and Consultant.
Tangible interaction and the Internet of Things (IoT) go hand in hand—literally. While IoT connects everyday objects such as lights and thermostats to the internet, tangible interaction gives users a physical way to control and experience those connections. Instead of tapping a screen, users can twist a knob to dim lights or place an object to trigger an automation.
Together, they create smart, responsive environments where the digital blends seamlessly into daily life. This combo enhances usability in that it makes smart tech more intuitive, especially for non-tech-savvy users.
Tip: Use tangible interaction in IoT when screen-based controls feel clunky—like smart kitchens, interactive classrooms, or healthcare devices.
Explore more about the Internet of Things to help find what the future of design may look and feel like for countless users.
To design for tangible interaction, start by thinking beyond the screen. Focus on how users physically engage with objects—grabbing, rotating, sliding, or stacking. The goal is to make digital functions feel natural through real-world gestures users already understand.
Map out the physical-to-digital connection: What happens when a user moves an object? What feedback do they get? Design clear affordances so users know what they can do with each physical element.
Tip: Prototype early using everyday materials like clay, cardboard, or LEGO. Test how people interact without instructions. If it “just makes sense,” you’re on the right track.
Explore how to prototype well and get more out of prototyping and more successful design solutions that can come from it.
To map physical gestures to digital actions, begin by identifying the most intuitive movements for each task. Think about how people naturally push, pull, rotate, or place objects—and attach those motions to digital outcomes that feel logical and satisfying.
For example, when a user turns a dial to adjust volume, the action mirrors real-world behavior and makes interaction feel seamless. Always provide feedback—visual, auditory, or haptic—so users understand the system’s response immediately.
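That dial-to-volume mapping can be sketched as a pure function: clockwise rotation raises the value, the result clamps at the ends, and a feedback cue confirms (or resists) the turn. The step size and 0–100 range are assumptions:

```python
def dial_to_volume(volume, degrees, degrees_per_step=15):
    """Map dial rotation to volume: positive (clockwise) degrees raise it.
    Returns the clamped 0-100 volume plus a feedback cue: 'tick' when
    the value changed, 'resist' when the dial hit a limit."""
    steps = degrees // degrees_per_step
    new_volume = max(0, min(100, volume + steps))
    cue = "tick" if new_volume != volume else "resist"
    return new_volume, cue
```

Returning the cue alongside the value keeps feedback tied to the same logic as the action itself, so the two can never drift apart.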
Tip: Test your gesture-to-action mapping early. Ask users to perform a task without guidance. If they pick the correct gesture instinctively, your mapping likely aligns with their mental model.
Read our article How to Use Mental Models in UX Design to hit the ground running with the concept of how to design for how users make sense of the world.
Tangible interaction and gesture-based interaction both involve physical movement—but they differ in how users engage with the system. Tangible interaction relies on physical objects that users manipulate—like blocks, dials, or sliders—that directly control digital functions. The object becomes part of the interface.
Gesture-based interaction, on the other hand, involves free-hand movements like swiping, waving, or pinching in the air. There's no object involved—it's just the body acting as the controller.
Tip: Use tangible interaction when you want tactile feedback and object-based logic. Choose gesture-based interaction when touchless or spatial control makes more sense—like in AR, VR, or public displays.
One future face of design may involve venturing in a more XR (extended reality) direction; enjoy our Master Class How to Innovate with XR with Michael Nebeling, Associate Professor, University of Michigan for a wealth of insights into this exciting realm.
To ensure accessibility in tangible interaction design, consider a wide range of physical, sensory, and cognitive abilities. Use multi-sensory feedback—like sound, vibration, and lights—to confirm actions. Design objects that are easy to grip, distinguish, and operate with one hand or limited dexterity.
Don’t rely on a single sense. For example, don’t depend only on visual cues—support them with tactile or auditory signals. Arrange to have designs tested with users who have diverse needs, including motor impairments or low vision (specialist agencies are best placed to ensure users with disabilities can enjoy using a product).
Tip: Label objects with both texture and contrast. Use universal design principles to make tangible interfaces usable by as many people as possible—right from the start.
Explore extensive aspects of accessibility in Accessibility: Usability for all and come away with a greater sense of how to include users with disabilities and help all users in the process.
Tangible interfaces shape user mental models by grounding digital interactions in the physical world. When users twist a knob or slide a tile, they build a clearer, more intuitive understanding of how the system works—because it mirrors real-world cause-and-effect relationships. These hands-on actions create direct, memorable links between gesture and outcome and slash reliance on memory.
Unlike abstract GUIs, tangible interfaces use physical cues—like weight, shape, or movement—that help users predict results and feel confident navigating the system.
Tip: Align physical actions with real-world logic. If turning right increases volume, make sure the system responds that way. This strengthens users’ mental models and boosts usability.
For intuitive design tips and much more, get right into the heart of the matter in our Master Class How to Design with the Mind in Mind with Jeff Johnson, Assistant Professor, Computer Science Department, University of San Francisco.
Haptics play a key role in tangible interaction by providing physical feedback that reinforces user actions. When users feel a click, vibration, or resistance while they’re turning a knob or pressing a button, it confirms the system registered their input. The tactile response boosts confidence, reduces errors, and strengthens the mental connection between action and outcome.
Haptics also make interactions more engaging and accessible—especially for users who rely on touch rather than sight or sound.
Tip: Design haptic feedback to match the intent—light vibration for a soft action, firm resistance for critical input. Use texture and motion to guide users intuitively and keep them on the right track.
Find out how to help users experience what great design feels like through haptic interfaces.
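One way to keep haptic feedback matched to intent, as the tip above suggests, is a central profile table, so a soft action never fires a harsh buzz. A sketch with assumed intensity and duration values:

```python
# Hypothetical haptic profiles: gentle cues for routine actions,
# firm ones for critical input. All values are illustrative.
HAPTIC_PROFILES = {
    "soft_action": {"intensity": 0.2, "duration_ms": 30},
    "confirm":     {"intensity": 0.5, "duration_ms": 60},
    "critical":    {"intensity": 1.0, "duration_ms": 150},
}

def haptic_for(action_kind):
    """Look up an action's haptic profile, defaulting to 'confirm'."""
    return HAPTIC_PROFILES.get(action_kind, HAPTIC_PROFILES["confirm"])
```

A single lookup point like this also makes it easy to tune intensities globally once you test with real users and real hardware.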
Designing for tangible interaction comes with unique challenges—designers must balance physical form, interaction logic, and digital feedback. Unlike GUIs, tangible interfaces require thoughtful consideration of materials, ergonomics, and spatial layout. Prototyping is harder, too—you can’t iterate a physical object as quickly as you can a screen.
Another challenge is to ensure the interface is intuitive without visible instructions. Users must feel what to do. Plus, you need to handle wear, maintenance, and real-world environments—like noise, lighting, or motion—that can affect sensors and usability.
Tip: Prototype early with low-fidelity materials. Test in real contexts, and involve users from diverse backgrounds to uncover physical and usability blind spots. The more you can find out earlier, the more you can save later.
Feel out the edges of successful design earlier with low-fidelity prototypes through our article 5 Common Low-Fidelity Prototypes and Their Best Practices.
To test tangible interfaces with users, observe how they interact with the physical object and the digital response—without prompting. The goal is to see if they intuitively understand what the object does. Can they guess the function? Do they get feedback when they act? Let them get on with it.
Use low-fidelity prototypes early—cardboard prototypes, 3D prints, or existing toys—to test concepts. Focus on how users grasp, manipulate, and interpret objects. Ask them to verbalize their thoughts (“think aloud”) and watch wherever they become confused, hesitant, or surprised.
Tip: Test in real-world settings, not just labs. Context—like lighting, sound, or body posture—matters more for tangible UX than screen-based design.
A treasure trove of insights into how to do effective usability testing for more successful products awaits you in our Master Class How to Get Started with Usability Testing with Cory Lebson, principal user experience researcher with 20+ years’ experience and author of The UX Careers Handbook.
Ishii, H., & Ullmer, B. (1997). Tangible bits: Towards seamless interfaces between people, bits and atoms. Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems, 234–241.
This landmark paper introduces the "Tangible Bits" vision, aiming to bridge the gap between digital and physical worlds through tangible user interfaces. Ishii and Ullmer propose interfaces where users interact with digital information by manipulating physical objects. This article is foundational for UX designers seeking to create seamless, embodied, and intuitive interactions. The authors provide practical examples—like metaDESK and ambientROOM—that remain influential in prototyping and designing physical-digital hybrid systems. It’s widely cited due to its visionary yet applicable insights into interface design, shaping decades of work on tangible computing.
Wang, S. J., Moriarty, P., & Wu, S.Z. (2015). Tangible interaction design: Preparing future designers for the needs of industrial innovation. Procedia Technology, 20, 162–169.
Wang, Moriarty, and Wu (2015) present the Tangible Interaction Design Education (TIDE) model to reshape industrial design training. They argue that as products increasingly blend physical and interactive elements, designers need stronger creativity that integrates both tangible form and digital behavior. TIDE, piloted at Monash University, bridges traditional industrial design and interaction design through studio-based, hands-on learning rooted in real-world applications. The curriculum emphasizes usability, physical interactivity, and user-centered strategies, preparing graduates for innovation-driven industries. This article is valuable for UX educators and curriculum designers aiming to cultivate multidisciplinary competencies essential for creating information-rich, interactive products.
Heradio, R., de la Torre, L., Galan, D., Cabrerizo, F. J., Herrera-Viedma, E., & Dormido, S. (2021). Development and technical design of tangible user interfaces: A review. Sensors, 21(13), Article 4258. https://doi.org/10.3390/s21134258
This review article offers an up-to-date, comprehensive overview of tangible user interfaces from a technical and design-oriented perspective. It surveys TUI implementations across domains like education, healthcare, and art, with a special focus on sensor integration, feedback mechanisms, and physical device construction. For UX designers and engineers, it provides practical insights on component selection (RFID, Arduino, etc.) and system architecture. The paper’s utility lies in its digestible, application-driven discussion of how to implement TUIs in real-world scenarios, making it a valuable reference for prototyping and development teams.
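To make the sensor-driven architectures surveyed in this review more concrete, here is a minimal sketch of a TUI event loop in Python: physical tokens, identified by RFID tag IDs, are mapped to digital actions. The tag IDs, action names, and the simulated read sequence are all illustrative assumptions, not taken from the paper; a real system would replace the simulated reads with input from hardware such as an Arduino-connected RFID reader.

```python
# Hypothetical mapping from RFID tag IDs (physical tokens) to digital
# actions. In a real TUI, placing a tagged object on a reader would
# trigger the corresponding behavior; here everything is simulated.
TAG_ACTIONS = {
    "04:A2:19:5F": "play_soundscape",   # e.g., a "forest" token
    "04:B7:C3:01": "show_weather_map",  # e.g., a "weather" token
}


def handle_tag(tag_id: str) -> str:
    """Translate a physical token's tag ID into a digital action."""
    return TAG_ACTIONS.get(tag_id, "ignore_unknown_token")


def run_session(tag_reads: list[str]) -> list[str]:
    """Process a sequence of simulated RFID reads (no hardware needed)."""
    return [handle_tag(tag) for tag in tag_reads]


if __name__ == "__main__":
    # Simulate a user placing a known token, then an unregistered one.
    print(run_session(["04:A2:19:5F", "FF:FF:FF:FF"]))
```

The dictionary-based dispatch keeps the physical-to-digital mapping declarative, which mirrors how many of the reviewed prototypes separate token identification (sensing) from application behavior (feedback).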
Boy, G. A. (2016). Tangible Interactive Systems: Grasping the Real World with Computers. Springer Cham.
This book differentiates tangible user interfaces (TUIs) from broader tangible interactive systems (TISs) and offers tools for designing complex, real-world systems. It explores models like SFAC and multipleV, focusing on human-centered design, creativity, sustainability, and formative evaluation, bridging HCI and industrial design.
Maher, M. L., & Lee, L. (2017). Designing for Gesture and Tangible Interaction. Morgan & Claypool.
Focused on embodied, post-WIMP interaction, this concise volume addresses design issues for both tangible and gesture-based systems. It reframes traditional input modes into tangible keyboards and models, includes cognitive design principles, and identifies open research challenges.
Take a deep dive into Tangible Interaction with our course Creativity: Methods to Design Better Products and Services.
The overall goal of this course is to help you design better products, services and experiences by helping you and your team develop innovative and useful solutions. You’ll learn a human-focused, creative design process.
We’re going to show you what creativity is, along with a wealth of ideation methods, both for generating new ideas and for developing your ideas further. You’ll learn skills and step-by-step methods you can use throughout the entire creative process. We’ll supply you with templates and guides, so by the end of the course you’ll have a toolbox of hands-on methods for your and your team’s ideation sessions. You’re also going to learn how to plan and time-manage a creative process effectively.
Most of us need to be creative in our work, regardless of whether we design user interfaces, write content for a website, work out appropriate workflows for an organization, or program new algorithms for a system backend. However, we all hit those times when the creative leap we so desperately need simply does not come. That can seem scary, but trust us: anyone can learn how to be creative on demand. This course will teach you ways to break the impasse of the empty page. We'll teach you methods that will help you find novel and useful solutions to a particular problem, be it in interaction design, graphics, code, or something completely different. It’s not a magic creativity machine, but when you learn to put yourself in this creative mental state, new and exciting things will happen.
In the “Build Your Portfolio: Ideation Project”, you’ll find a series of practical exercises that together form a complete ideation project, so you can get your hands dirty right away. If you complete these optional exercises, you will get hands-on experience with the methods you learn, and in the process you’ll create a case study for your portfolio to show future employers or freelance clients.
Your instructor is Alan Dix. He’s a creativity expert, professor and co-author of the most popular and impactful textbook in the field of Human-Computer Interaction. Alan has worked with creativity for the last 30+ years, and he’ll teach you his favorite techniques as well as show you how to make room for creativity in your everyday work and life.
You earn a verifiable and industry-trusted Course Certificate once you’ve completed the course. You can highlight it on your resume, your LinkedIn profile or your website.