As part of a funding program run by the Baden-Württemberg Ministry of Science, Research and the Arts, innovative teaching and learning scenarios as well as new approaches to using Virtual Reality (VR) and/or Augmented Reality (AR) in university teaching are to be researched and made usable in an application-oriented way. Proposals can be submitted until July 31, 2019.
Great technical progress has been made in Virtual Reality (VR) and Augmented Reality (AR) in recent years. This has helped these technologies evolve from “toy” to “tool” and find increasing use. That is hardly surprising: no other digital medium currently lets users engage with a subject as directly and intensively, with the whole body so to speak, as VR/AR does. In particular, the technologies are credited with great potential for improving teaching and learning and for enabling entirely new teaching and learning scenarios.
The state’s overall concept for promoting digitalization in university teaching includes the four-part package of measures “Teaching4Future digital@bw”. Its fourth component, “Teaching4Future with virtual elements digital@bw” (T4F-virtual), will now fund projects that address the potential of Virtual Reality (VR) and Augmented Reality (AR) for education. A total of 1.6 million euros is available for project terms of up to 36 months (a maximum of 400,000 euros per project). The application deadline is July 31, 2019.
More information is available on the website of the Baden-Württemberg Ministry of Science, Research and the Arts.
After winning the 2015 FIFA Women’s World Cup, the USA Women’s National Soccer Team is ready to defend its title in the 2019 tournament.
To capitalize on the excitement, USA Today has published a pair of augmented reality experiences that give readers the opportunity to not only learn more about the players on the world’s top-ranked women’s team, but also try their hand at playing goalkeeper in a virtual shootout.
With the USA Today app for iOS or Android installed on an AR-capable mobile device, readers can try the experiences by tapping the “Meet the Team” and “Make the Save” sections displayed in blue boxes at the top of the app’s screen, or by jumping to the sports section of the app.
Sponsored by Fox Sports, the Meet the Team experience lays out a virtual football pitch (or soccer field) in the user’s camera view. Users can scroll through the roster and select players to learn more about them via infographics and narration. World Cup standings and schedule are also displayed in the user’s physical space.
“We wanted to provide an immersive way to learn more about the US Women’s team throughout the lifecycle of this year’s World Cup,” said Ray Soto, director of emerging technology at USA Today, in a statement. “We have seen how much AR informs and entertains users, so it’s a natural fit for brands to want to be a part of these engaging experiences as well. And, keeping consumers engaged throughout the World Cup experience really builds excitement.”
The Make the Save experience, on the other hand, focuses on Team USA goalkeeper Alyssa Naeher. After learning more about Naeher’s record and her approach to her position, users can participate in a simulated goalkeeper mini-game. Then, after stepping into position in front of a virtual goal, users are challenged to defend against penalty kicks from a digital opponent, with difficulty escalating from 20 miles per hour to a professional-level 80 mph.
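For a sense of how steep that difficulty ramp is, the keeper’s available reaction time is just distance divided by ball speed. A quick back-of-the-envelope calculation (the roughly 11 m penalty distance is the standard spot, an assumption on our part rather than something the app states):

```python
def reaction_time(distance_m, speed_mph):
    """Seconds a keeper has before a ball covers distance_m at speed_mph."""
    speed_ms = speed_mph * 0.44704   # 1 mph = 0.44704 m/s exactly
    return distance_m / speed_ms

# At the app's top speed of 80 mph from the penalty spot (~11 m),
# the keeper has well under half a second to react:
print(round(reaction_time(11, 80), 2))  # prints 0.31
```

At the starting speed of 20 mph, the same shot takes over a second, which is why the escalation feels so dramatic.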
For this latest experience, USA Today used a photogrammetry approach, similar to its AR feature on Oscar-nominated costumes, in order to realistically capture Naeher’s appearance for the app interaction.
About this time last year, Facebook and Snapchat juggled several augmented reality experiences devoted to the Men’s World Cup. This year, USA Today has served up an objectively more in-depth and informative experience for Team USA as the Women’s World Cup steps into the spotlight.
Moreover, USA Today continues its general run of publishing insightful and immersive AR experiences, including an exploration of the historical Notre Dame Cathedral and the world’s tallest skyscraper.
With three new centers in Munich, Würzburg and Nuremberg, the Free State of Bavaria wants to advance the development of so-called Virtual Reality and Augmented Reality, terms that describe various presentations of computer-generated, artificially created worlds. In the current two-year budget, 1.5 million euros are available for these technologies, Digital Minister Judith Gerlach (CSU) said on Thursday in the economic affairs committee of the Bavarian state parliament. The goal is to enable new applications in education, museums, health care and industry. “Virtual Reality and Augmented Reality ‘Made in Bavaria’ should become a national and international brand,” Gerlach said.
By September, the “XR Hub” application center is to be established in Munich; it will receive 500,000 euros in support annually. The regional centers for Lower and Middle Franconia will follow in March 2020 and will together likewise receive half a million euros for the year. “However, we need permanent funding for this and also ask for your support,” Gerlach said, addressing the members of parliament.
She also announced that 300 women had applied for the first talent program aimed at bringing more women into digital professions. Each year, 50 young women between 18 and 30 are to take part, with mentors from the industry at their side, among them Sabine Bendiek, head of Microsoft Germany. (dpa)
Photo: JUSTIN SULLIVAN / AFP
Online multiplayer could revolutionize how we interact in AR.
While VR makes you forget about the outside world, AR instead enhances reality by blending virtual elements with your real-world environment. It unites both realities in the most seamless and exciting way possible, allowing users to access one-of-a-kind immersive experiences using existing smartphone technology. You’re almost certainly aware of augmented reality and its application in games such as Pokémon Go, Google Ingress, and The Walking Dead: Our World, titles which have contributed heavily to the rise of location-based AR gaming over the past three years. Almost halfway into 2019, we’re beginning to see the growth of a new feature in AR-based mobile gaming that further impacts how we interact in AR: remote multiplayer functionality.
Gaming, business, communication, entertainment: remote multiplayer AR has the potential to breathe new life into a multitude of activities. Imagine a father on a business trip building an AR Lego spaceship with his son, who is miles away at home; or a distributed team of engineers working on the schematics for an actual spaceship in an augmented meeting room. Two friends in different towns could even battle as virtual cyber raccoons in a deathmatch-style competition. The challenge with remote multiplayer AR is sharing the same AR experience across two different physical environments simultaneously and in real time. Thanks to continued advancements to both Google’s ARCore and Apple’s ARKit platforms, however, users with compatible smartphones could be seeing a lot more AR experiences with remote multiplayer functionality in the near future.
The introduction of the remote online multiuser feature is what makes AR so fascinating to experience and so complex to develop. The technical side is a bit more challenging than in VR where online multiplayer is nothing new. Indeed, in VR there is no such difficulty with tracking the surrounding space: all the scenes are predefined which makes VR multiplayer very similar to conventional PC and mobile-based multiplayer.
In AR you have to build the scene while taking into account the parameters of each player’s physical environment (one player might be on a football field, another in a living room). This can make it difficult to match users, as you must first make sure their scenes can be built and merged correctly to allow for accurate interactions. The more users playing with each other simultaneously, the more complex it gets.
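One common way to reason about this merging problem (a minimal sketch, not any particular vendor’s implementation) is to have both players agree on a shared anchor, express every object’s pose relative to that anchor, and let each device map anchor-relative coordinates into its own world frame:

```python
import math

def to_local_frame(remote_pos, anchor_pos, anchor_yaw):
    """Map a position expressed relative to a shared anchor into the
    local player's world frame (2D ground plane, for brevity).

    remote_pos -- (x, z) of an object relative to the shared anchor
    anchor_pos -- (x, z) of the shared anchor in the local world
    anchor_yaw -- local anchor rotation about the vertical axis, radians
    """
    x, z = remote_pos
    c, s = math.cos(anchor_yaw), math.sin(anchor_yaw)
    # rotate into the local anchor's orientation, then translate
    return (anchor_pos[0] + c * x - s * z,
            anchor_pos[1] + s * x + c * z)

# The remote player places a ball 1 m in front of the shared anchor;
# locally, the anchor sits at (2, 3) rotated 90 degrees:
print(to_local_frame((0.0, 1.0), (2.0, 3.0), math.pi / 2))
```

Because each player’s room differs, the hard part in practice is establishing that shared anchor reliably, which is exactly what ARCore’s and ARKit’s shared-session features are built to do.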
Despite these inherent difficulties, more developers than ever have begun experimenting with implementing this functionality in their own applications. In terms of universal compatibility between devices, WATTY Technology is leading the way with its WATTY Remote / WATTY Ghost Mode technology. The London-based multiuser AR technology company has already announced its upcoming AR-based ping-pong application, AR Ping Pong, in which players from around the globe compete with one another in real-time table tennis matches, as well as Dance Battle, a music-based experience that has players duking it out for the honor of “Best Twerk.”
Vlad Vodolazov, Co-Founder and CTO of WATTY, explains why ping-pong is the best way to demonstrate the potential and interactivity of remote online multiuser AR: “Table tennis is a fast-paced game and it brings out different sides of this technology like matchmaking and lag compensation techniques. From a gaming perspective, this is a timeless classic reimagined as an immersive uninterrupted AR experience — anyone can recall playing ping-pong at some point of their lives.”
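Lag compensation in a game as fast as table tennis often comes down to dead reckoning: extrapolating the last reported state across the measured network delay so the ball is drawn where it should be now, not where it was when the packet was sent. A minimal sketch of the idea (illustrative only, not WATTY’s actual technique):

```python
def extrapolate(last_pos, velocity, latency_s):
    """Dead-reckon where a remote object should be drawn locally, given
    its last reported position, its velocity, and the one-way latency."""
    return tuple(p + v * latency_s for p, v in zip(last_pos, velocity))

# Ball last reported at x=0.2 m, moving at 3 m/s toward us, 50 ms latency:
print(extrapolate((0.2, 0.9), (3.0, 0.0), 0.050))
```

When the next real update arrives, the client blends toward it rather than snapping, which hides small prediction errors from the player.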
With AR Ping Pong, WATTY gets the remote online multiuser concept off to a fiery start. The company is working on business cases to make AR more useful and fun, aiming to turn the mundane and routine into entertainment. “Bringing remote online multiuser AR experiences to daily activities such as grocery shopping, sports, and social interactions will certainly improve the way we live and work today,” states Vlad Vodolazov.
I for one can’t wait for the day there’s a remote AR app that finally turns the chore of shopping into a thrilling multiplayer experience.
Image Credit: WATTY Technology
At the company’s annual WWDC developer conference today, Apple revealed ARKit 3, its latest set of developer tools for creating AR applications on iOS. ARKit 3 now offers real-time body tracking of people in the scene as well as occlusion, allowing AR objects to be convincingly placed in front of and behind those people. Apple also introduced Reality Composer and RealityKit to make it easier for developers to build augmented reality apps.
Today during the opening keynote of WWDC in San Jose, Apple revealed ARKit 3. First introduced in 2017, ARKit is a suite of tools for building AR applications on iOS.
From the beginning, ARKit has offered computer vision tracking which allows modern iOS devices to track their location in space, as well as detect flat planes like the ground or a flat table which could be used to place virtual objects into the scene. With ARKit 3, the system now supports motion capture and occlusion of people.
Human Occlusion & Body Tracking
Using computer vision, ARKit 3 understands the position of people in the scene. Knowing where the person is allows the system to correctly composite virtual objects with regard to real people in the scene, rendering those objects in front of or behind the person depending upon which is closer to the camera. In prior versions of ARKit, virtual objects would always show ‘on top’ of anyone in the scene, no matter how close they were to the camera. This would break the illusion of augmented reality by showing conflicting depth cues.
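Conceptually, people occlusion reduces to a per-pixel depth comparison between the virtual object and the estimated depth of the person: whichever surface is nearer to the camera wins. ARKit performs this on the GPU using a segmentation mask and a depth estimate; the toy function below is purely illustrative of the decision being made at each pixel:

```python
def composite_pixel(virtual_rgb, camera_rgb, virtual_depth, person_depth):
    """Pick the color for one pixel when compositing AR content.

    virtual_depth -- camera distance of the virtual object here, or None
    person_depth  -- estimated camera distance of a detected person, or None
    With no person detected, the virtual object always draws on top,
    which matches the pre-ARKit-3 behavior described above.
    """
    if virtual_depth is None:
        return camera_rgb               # nothing virtual at this pixel
    if person_depth is not None and person_depth < virtual_depth:
        return camera_rgb               # the person is closer and occludes
    return virtual_rgb                  # object is in front (or no person)

# A person 1.5 m away correctly hides a virtual chair placed 3 m away:
print(composite_pixel("chair", "person", virtual_depth=3.0, person_depth=1.5))
```

The hard part in the real system is not this comparison but producing an accurate person mask and depth estimate from a single moving camera in real time.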
Similar tech is used for real-time body tracking in ARKit 3. By knowing where people are in the scene and how their body is moving, ARKit 3 tracks a virtual version of that person’s body which can in turn be used as input for the AR app. Body tracking could be used to translate a person’s movements into the animation of an avatar, or for interacting with objects in the scene, etc.
From the footage Apple showed of their body tracking tech, it looks pretty coarse at this stage. Even with minor camera movement the avatar’s feet don’t remain particularly still while the rest of the body is moving, and small leg motions aren’t well tracked. When waving, the avatar can be seen to tip forward in response to the motion even though the user doesn’t. In the demo footage, the user keeps their arms completely out to the sides and never moves them across their torso (which would present a more challenging motion capture scenario).
For now this could surely be useful for something simple like an app which lets kids puppeteer characters and record a story with AR avatars. But hopefully we’ll see it improve over time and become more accurate to enable more uses. It’s likely that this was a simple ‘hello world’ sort of demo using raw tracking information; a more complex avatar rig could smartly incorporate both motion input and physics to create a more realistic, procedurally generated animation.
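One standard trick such a rig could apply is to low-pass filter the raw joint positions before driving the avatar, trading a little latency for stability. A minimal sketch using an exponential moving average (our illustration, not Apple’s actual smoothing):

```python
def smooth(samples, alpha=0.3):
    """Exponential moving average over a stream of noisy joint positions.
    alpha near 0 = heavy smoothing (more lag); alpha near 1 = raw data."""
    out, state = [], None
    for s in samples:
        state = s if state is None else alpha * s + (1 - alpha) * state
        out.append(state)
    return out

noisy_ankle_x = [0.00, 0.04, -0.03, 0.05, -0.02]  # meters; a jittering foot
print(smooth(noisy_ankle_x))
```

The filtered foot position wanders far less than the raw samples, which is exactly the kind of post-processing that could keep an avatar’s feet planted while the torso moves.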
Both human occlusion and body tracking will be important for the future of AR, especially with head-worn devices which will be ‘always on’ and need to constantly deal with occlusions to remain immersive throughout the day. This is an active area of R&D for many companies, and Apple is very likely deploying these features now to continue honing them before the expected debut of their upcoming AR headset.
Apple didn’t go into detail but listed a handful of other improvements in ARKit 3:
- Simultaneous front and back camera
- Motion capture
- Faster reference image loading
- Auto-detect image size
- Visual coherence
- More robust 3D object detection
- People occlusion
- Video recording in AR Quick Look
- Apple Pay in AR Quick Look
- Multiple-face tracking
- Collaborative session
- Audio support in AR Quick Look
- Detect up to 100 images
- HDR environment textures
- Multiple-model support in AR Quick Look
- AR Coaching UI
With ARKit 3, Apple also introduced RealityKit which is designed to make it easier for developers to build augmented reality apps on iOS.
Building AR apps requires a strong understanding of 3D app development, tools, and workflows—something that a big portion of iOS developers (who are usually building ‘flat’ apps) aren’t likely to have much experience with. This makes it less likely for developers to jump into something new like AR, and Apple is clearly trying to help smooth that transition.
From Apple’s description, RealityKit almost sounds like a miniature game engine, including “photo-realistic rendering, camera effects, animations, physics and more.” Rather than asking iOS developers to learn game engine tools like Unity or Unreal Engine, it seems that RealityKit will be an option that Apple hopes will be easier and more familiar to its developers.
With RealityKit, Apple is also promising top-notch rendering. While we doubt it’ll qualify as “photo-realistic,” the company is tuning its rendering to allow virtual objects to blend as convincingly as possible into the real world through the camera of an iOS device, layering effects onto virtual objects as if they were really captured through the camera.
“RealityKit seamlessly blends virtual content with the real world using realistic physically-based materials, environment reflections, grounding shadows, camera noise, motion blur, and more, making virtual content nearly indistinguishable from reality,” Apple writes.
RealityKit, which uses a Swift API, also supports the creation of shared AR experiences on iOS by offering a network solution out of the box.
Just like RealityKit, Reality Composer aims to make things easier for developers not experienced with game engines, workflows, and assets. Apple says that Reality Composer offers a library of existing 3D models and animations with drag and drop ease, allowing the creation of simple AR experiences that can integrate into apps using Xcode or be exported to AR Quick Look (which allows built-in iOS apps like Safari, Messages, Mail, and more, to quickly visualize 3D objects at scale using augmented reality).
In addition to the built-in object library, Reality Composer also allows importing 3D files in the USDZ format, and offers a spatial audio solution.
Immersive technologies can help students understand theoretical concepts more easily, prepare them for careers through simulated experiences and keep them engaged in learning.
VR helps astronomy students at San Diego State University understand concepts that are hard to explain verbally.
Immersive reality is bumping us into the deep end, virtually speaking. Colleges and universities large and small are launching new labs and centers dedicated to research on the topics of augmented reality, virtual reality and 360-degree imaging. The first academic conference held completely in virtual reality recently returned for its second year, hosted on Twitch by Lethbridge College in Alberta and Centennial College in Toronto. Majors in VR and AR have begun popping up in higher education across the United States, including programs at the Savannah School of Design (GA), Shenandoah University (VA) and Drexel University Westphal (PA). Educause experts have most recently positioned the timing for broad adoption of these technologies in education at the two-year to three-year horizon. And Gartner has predicted that by the year 2021, 60 percent of higher education institutions in the United States will “intentionally” be using VR to create simulations and put students into immersive environments.
If you haven’t already acquired your own headset or applied for a grant from your institution to test out AR or VR for instruction, it’s time. We’ve done a scan of some of the most interesting projects currently taking place in American classrooms to help you imagine the virtual possibilities.
1) Grasping Concepts
San Diego State University’s Instructional Technology Services unit launched a Virtual Immersive Teaching and Learning (VITaL) initiative in 2017. Since then, dozens of faculty members have tested the use of AR, VR, mixed reality and 360-degree-video tools for use in numerous disciplines. Instructors can check out gear to immerse their students in new learning environments in their chosen spaces, or take over the VITaL Learning Research Studio to accommodate up to 40 students at four different stations.
Among the many types of experimentation, Gur Windmiller, an instructor in the astronomy department, has found VR “a perfect fit” for teaching students about astronomy. Concepts that can be hard to understand with verbal explanations suddenly make sense when students experience them visually, he explained in a university article. “It’s a very visual subject,” he said. “You can create these sandbox worlds where students can just play around with astronomical objects and see what happens.”
2) Recreating Past Experiences for New Learners
The issues of truth and media manipulation brought students from Culture & Media, Journalism & Design and Theatre together at The New School (NY) in a class co-taught by liberal arts instructor Sara Montague and XReality Center head Maya Georgieva. Their task: to study, and recreate, that pioneer of “fake news,” the War of the Worlds radio broadcast, coinciding with its 80th anniversary. Students performed and filmed key scenes from a recreation of the event.
The XReality Center, which allows students and faculty to check out XR products and puts on workshops to help them learn how to use the technology, has also been behind initiatives to digitize nearly the entirety of the University Center campus in New York City, study classic pieces of fashion and capture stories of survivors of sexual assault and their journeys to healing.
3) Stagecraft for Theater Students
A project at Maine’s Husson University will help theater people visualize stage design for their productions. The university’s integrated technology department is developing AR Stagecraft, an app for iPhones and iPads that provides an immersive experience on an empty stage. Students in Husson’s entertainment production program are currently designing theater sets in a computer aided drafting class, which will be imported into the app to provide users the experience of walking through a set on stage before construction begins. This is just the first of many AR and VR projects to come, promised the institution, culminating in construction of an Interactive Experience (IEX) Center, which will be part of a new College of Business building expected to open in 2021.
4) Virtual Reconstruction of History
Since 2005, students and faculty from the University of Denver (CO) Anthropology Department have worked with members of the public to research, interpret and preserve Amache, a World War II Japanese-American internment camp about four hours south of Denver. While the project has long digitized objects linked to the site to make the work accessible to those who can’t visit physically, now a team is using drone image capture to produce a 3D reconstruction of the camp. Eventually, a composite of those photos will be used to feed a VR app that will allow viewers to move through the site via a headset, and an AR app that will let users hold up their devices and see what was there during the camp’s operation.
The Amache project is just one of many university-led projects to bring important sites back to life, particularly those tied to people underrepresented in history. Members of several departments at the University of Arkansas, including Anthropology and Humanities, are leading a similar project to enable people to experience the Spiro Mounds, a gathering place for a large community of Spiroans, a prehistoric culture that inhabited the site between AD 900 and 1300.
“While much of the Spiro culture is still a mystery, the 3D immersive environment will give participants the chance to see what a ceremony may have been like, in a very immediate and interactive way,” said David Fredrick, director of the Tesseract Center, a game development and visualization studio housed at the university. “Creating this experience allows us to bring the past into the present in a very tangible and meaningful way.”
And the libraries at the University of Nevada, Reno are working with that institution’s Anthropology department to create a “virtual museum,” which involves scanning and photographing the department’s collection of Native American baskets. “The idea here is to use VR technology to allow library users and members of the community to have access to an impressive and expansive collection of exquisite baskets,” said Multimedia Production Specialist Michelle Rebaleati during a TEDxUniversityofNevada talk. “In the VR museum, visitors can pick up, inspect and look closely at the items contained in the collection. Because the actual baskets are delicate and can’t be handled, we are delivering the collection to users in a new, revolutionary way.”
5) Going on Space Walks
During a space walk, an astronaut’s job involves communication — strictly via voice — with mission control and others inside and outside the spacecraft. Now NASA is tapping ideas from higher ed to develop innovative helmet-based displays that use Microsoft HoloLens to provide instructions through the augmented reality display environment. Numerous student teams from institutions across the country are participating in this latest NASA SUITS (“Spacesuit User Interface Technologies for Students”) challenge. A set of finalist teams recently headed to Johnson Space Center to test their designs, including students from the University of Baltimore’s (MD) Digital Whimsy Lab, the University of Akron (OH), the University of North Texas, Florida’s University of Miami, the University of Colorado Boulder, Virginia Tech and Boise State University (ID).
The software from Boise, for example, would help ground control send new procedures during an extravehicular activity and provide displays of images, 3D paths and instructions on how to safely navigate outside the International Space Station. In true student fashion, that team’s software, “ARSIS” (for “Augmented Reality Space Informatics System”), includes a voice recognition system named Adele (for mathematician and programmer Adele Goldstine, who wrote the documentation for ENIAC), which, when a warning goes off, breaks out into “Skyfall” by today’s Adele.
6) Reimagining the Future
Centuries before uranium mines in the area became an environmental hazard, Church Rock, NM, was the home to the Diné, Navajo people. Now, waste cleanup efforts are driving the families of one Diné community to seek permanent relocation to another part of their ancestral lands, a mesa without access to electricity, running water or paved roads. To help the residents of the Diné Red Water Pond Road Community envision what their new settlement could look like, faculty and students from the University of New Mexico’s schools of Architecture and Engineering are using VR and 3D renderings to create potential scenarios to “make the design tangible and immerse the community into the plan so that they can see that their future on the mesa is within reach,” according to an article on the project.
Catherine Harris, assistant professor of landscape architecture and art and ecology, said the various digital tours of alternate designs will help residents of the community understand the tradeoffs they’ll be making along the way in their off-the-grid move. For instance, more water collection requires more built space, conventional waste treatment eliminates the possibility for methane harvest, and so on. “Ultimately,” Harris said, “this project is for the Diné. It gives agency to the community so they can realize a sustainable and thriving future for all generations.”
7) Practicing Clinical Care
Students in Western Carolina University’s (NC) School of Nursing have tested the use of VR for experiential learning in emergencies. The idea is to help community nurses gain exposure to clinical situations that might not occur often enough in real life, to help them become comfortable with their responses. For an initial testing of the setup, participants donned VR headsets to enter a scene in which a patient has come into a clinic experiencing an allergic reaction to medication that is worsening, including difficulty with breathing. The students had to make rapid decisions about the patient’s care and prioritize their responses at numerous points during the simulation.
“Using virtual reality in clinical education allows learners to be involved in real-life experiences without real-life consequences,” said Elaine Alexander, director of a regional simulation center that brings educational opportunities to outlying areas, in a university story.
8) Hands-on Railroading
An immersive program at Pennsylvania State University, Altoona is helping prepare the newest generation of railroaders. While students — future train engineers — still go out into the field to get practice with railroad infrastructure, railcars and locomotives, they also use an industry-grade locomotive simulator that includes a virtual welder, to let them try out track work and welding.
“We can build an entire railroad in the simulation and receive diagnostics from the railroad itself to see how it’s running,” said student Michael Yohn, in university reporting. “It’s a really cool piece of equipment.”
Last year, students in the Rail Transportation Engineering (RTE) program headed to Europe to record 360-degree videos at various railroad locations in Germany, Switzerland and Austria under the guidance of instructor Bryan Schlake. They also collected videos of the trains, railroads, yards, terminals, signals and footage from the operator’s cabs on several trains. Instructional Designer Joe Scott helped the team produce 360-degree video content showing various European railway applications, now made available on the RTE YouTube channel.
9) Feeling the Impact of Decisions
A course in the Executive MBA program at Fordham University’s (NY) Gabelli School of Business puts its future leaders through two VR exercises intended to help them understand the power of communication and teamwork. Led by Julita Haber, director of the full-time cohort MBA and a communications and media professor, one activity requires students to walk across a balance beam at the top of a 1,400-foot skyscraper, urged on by team members. Another tasks student teams with selecting one person to deactivate a bomb while receiving directions from the others.
Observed one student, William Allan, a financial professional at a global business and tech consultancy: “Each exercise provided me with a different framework for my thought process in how to approach a situation. It [showed] me the importance of delegation and teamwork in time-sensitive situations.”
“The topics of team dynamics and technology go very well together,” said Haber in a statement. “I wanted to create activities that would evoke emotions and enhance their communication skills to reach a goal.”
Many of us know that you can make a few bucks from Amazon by helping the company sell its wide array of products, but now there’s a very different way to make a buck with the company, and it involves 3D technology.
The „Amazon Real World Image Study“ is a new, limited time program based in New York City that asks the public to submit themselves to full body 3D scanning for Amazon’s research and get paid in the process.
The Body Labs team, a New York-based startup acquired by Amazon in 2017, is conducting the program. The company’s work revolves around artificial intelligence, computer vision, and body modeling.
Back when the company was independent, it described a process of capturing a photographic image of a person and then transforming that image into a 3D model of the person. That 3D data could then, based on one part of the company’s vision, be used for things such as fitting clothing to a person’s body shape.
That vision fits in perfectly with Amazon’s goal of allowing online shoppers to more accurately purchase items that fit by using augmented reality in the same way that the company has begun using AR to allow you to place items in your home before clicking the buy button.
And beyond the realm of smartphone apps, there’s also a possibility that this research program will also help to train Amazon’s Echo Look device.
The $99 device, which was released in 2018, helps users determine whether or not a particular outfit looks good by combining the device’s depth camera with machine learning (along with advice from fashion experts) via the Style Check feature on the Echo Look.
In return for submitting yourself to Amazon’s body scan, you’ll get a $25 Amazon gift card. The process takes around 30 minutes and will require those involved to change into form-fitting clothing provided by Amazon.
The location is in one of New York’s tourist centers (Union Square, at 37 East 18th Street), so the crowds looking to snag a bit of free Amazon cash are likely to be massive. But if you’re interested, time is wasting, as the research program ends in June.
Reach EDU looks to expand science, tech, engineering, arts, and mathematics (STEAM) learning through AR robots.
If you’re not familiar with Reach Robotics’ line-up of kid-friendly robots, MekaMon are programmable machines that players control via a smart device to battle digital opponents, both real and AI, in an augmented space, adding an extra layer of visual excitement to the conventional playtime experience. Now the company is adding even more educational value in the form of STEAM-based learning with Reach EDU.
Learning through play is a key part of the Reach EDU experience. Originally announced at CES 2019, the Reach EDU app has officially launched and now works alongside the existing MekaMon augmented reality gaming app, bringing together creative STEAM learning and advanced robotics in one approachable package.
The World Economic Forum recently listed problem-solving, critical thinking, and creativity as the top three skills children need for success. Reach EDU Missions have been created using gamification techniques to reward and challenge students in ways that encourage creative problem solving and deeper engagement with coding concepts.
The Reach EDU app incorporates engaging storytelling to guide learning. Led by Ivy, the Head Engineer at the Mekacademy, students learn to code their MekaMon in a series of game-like challenges in preparation for a mission to Mars.
The app is structured around four core features which can be used independently of the missions to encourage creative play and experimentation:
- Free Drive: Freely experiment with MekaMon’s fluid movement and lifelike animations.
- MekaDraw: Trace a line across the screen and MekaMon will follow.
- MekaMotion: Code directly on the MekaMon robot by moving each of its limbs to build up a series of commands through stop motion animation.
- MekaCode: Code direct commands in Scratch-based block coding.
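To illustrate what a block-coding sequence amounts to under the hood, here is a hypothetical mini-interpreter in the spirit of MekaMotion and MekaCode; the command names, step size, and API below are invented for illustration and are not the real MekaMon interface:

```python
import math

def run_program(commands, step=0.1):
    """Replay a list of ('forward', n) / ('turn', degrees) blocks and
    return the robot's final (x, y) position in meters and heading in
    degrees. Purely a hypothetical sketch of block-command semantics."""
    x = y = heading = 0.0
    for op, arg in commands:
        if op == "forward":
            # advance n steps of `step` meters in the current heading
            x += arg * step * math.cos(math.radians(heading))
            y += arg * step * math.sin(math.radians(heading))
        elif op == "turn":
            heading = (heading + arg) % 360
    return (round(x, 3), round(y, 3), heading)

# Walk a square: four sides of 5 steps with 90-degree turns,
# ending back at the origin facing the original direction.
print(run_program([("forward", 5), ("turn", 90)] * 4))
```

Building a command list by physically posing the robot (MekaMotion) or by snapping blocks together (MekaCode) both reduce to producing a sequence like this and replaying it.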
Additionally, Reach EDU is designed with a range of learners and experience levels in mind.
For those just getting started, MekaDraw, MekaMotion and MekaCode help users get familiar with coding concepts. Experienced learners can unlock a world of new programming potential with further applications such as Swift Playgrounds.
For students further along in their studies, including those at degree and postgraduate level, third-party educational platforms are being developed to work with MekaMon in addition to the existing Swift Playgrounds integration.
These will include a Reach Raspberry Pi processing module and a browser version of Reach EDU incorporating Python to support KS3 and KS4 curricula and open the platform up to advanced experimentation.
Reach EDU is the realization of the primary inspiration behind MekaMon. Reach Robotics CEO, Silas Adekunle, created the first prototype while teaching in schools. He quickly learned that robotics and gaming captured students’ attention, making learning infinitely more engaging.
“There’s a huge amount of creative potential with MekaMon, due to the scope of its expressive movement and personality. Reach EDU is about delivering the tools to take advantage of this, by creating a versatile, accessible, and fun platform for effective STEAM education and ongoing innovation.” – Silas Adekunle, CEO, Reach Robotics
Reach Robotics is exploring how their AR advances with MekaMon will evolve the Reach EDU experience. Reach EDU will have ongoing updates with new missions, content and supporting materials continually being developed.
The Reach EDU app is available for free on the App Store and Google Play. MekaMon robots are available via Apple (in-store and online) and Amazon, among other retailers. Schools can order multiple units at a discount via email@example.com.
According to Change the Equation, the number of STEM jobs will grow 13 percent between 2017 and 2027, compared to 9 percent for non-STEM jobs, with computing, engineering, and advanced manufacturing leading the way.
With platforms like Reach EDU, our present-day learners will be prepared to be future leaders through the power of creativity, problem-solving, advanced robotics, and fun learning experiences.