
3D CAD World

Over 50,000 3D CAD Tips & Tutorials. 3D CAD News by applications and CAD industry news.


VR software

Immersive Design—A virtual reality case study

February 16, 2022 By Leslie Langnau Leave a Comment

A new Adidas maker space—located inside a giant, digital sneaker—features virtual-reality 3D design tools for long-distance design collaboration.

Jean Thilmany, Senior CAD Editor

When Adidas designers were tasked with creating a seamless sneaker, they donned their Oculus headsets and got to work in the virtual world.

The team met regularly in a large, footprint-shaped studio that existed only in a virtual world entered through the Oculus. Of course, each team member was really in a separate physical space, but with the help of the headsets and the immersive world, they felt as though they were meeting in real life.

The move dramatically slashed time spent creating an initial mock-up: from 21 days to less than one day, says Paul Scholz, Adidas senior footwear designer.

“In the design process, you create boards to inspire you and you brainstorm together. What we did in this virtual environment was the same, but we designed the actual product,” he says.

The Harden Vol. 5, the debut shoe in the Adidas Futurenatural line of molded, seamless sneakers. Credit: Adidas

Scholz and his colleagues spoke in November 2021 at the online Around Conference. The conference sponsor, Gravity Sketch, makes a 3D-design platform hosted in virtual reality, which is the tool Adidas used to help design its Futurenatural shoes. The company gave the same name to its range of tools accessible within the immersive environment.

The one-piece, seamless sneaker line just debuted, about 18 months after the design team’s initial virtual meeting. The Futurenatural sneakers are molded rather than sewn. That is, the upper is fused to the sole with high pressure and heat to create what looks like one continuous shoe, with no obvious break between the top and the bottom.

Traditionally, footwear designers often work in two dimensions, extrapolating 2D lines to form lateral views of the proposed shoe. But building out designs in the 3D virtual environment makes a mockup materialize more quickly, says Robert Stinchcomb, Adidas creative designer. He played a lead role in bringing the virtual system into the company.

Designers wear Oculus headsets to design in 3D with Gravity Sketch software. They feel as if the design is floating in front of them, inside a virtual world, and they can easily make updates and changes to that design. Credit: Gravity Sketch

“Now it’s down to showing up at work at nine and at 3 pm having a mockup at the point where you could see everything and talk about ‘let’s switch the layering here,’” Stinchcomb says.

The mockup is an early-stage design “almost like a napkin sketch,” he adds. “This is a place we sketch out designs before fleshing them out, before we make a sample. And we’re doing it in a room that is super collaborative where we can talk to each other even though we may not even be in the same country.”

The team can quickly come up with 10 or 15 sneaker concepts, says Arnau Sanjuan, Adidas design director, footwear innovation.

“It’s easy to see how designs would look, to play around with them, to brainstorm ideas together quickly,” he says.

The Futurenatural studio looks much like a virtual reality game. Designers move about in the virtual world—moving between a series of “stations”—the same way they would in any virtual-reality game in which avatars work together.


The first stop is for design. Here, designers create the 3D model of the shoe. Surfaces are added at a second stop. Then it’s on to detailing and rendering. All before a physical prototype is created.

Because the shoe is easy to see and understand, the finished mockup can be immediately shared with manufacturers and marketing people for their feedback. They needn’t have an Oculus, as the designs can be captured and shared via other methods. Suggested changes are quickly made within the virtual environment.

James Harden’s foot
The mockup starts with the human foot. But for Futurenatural, the company took another tack. Like many shoemakers, the company had been using a generic last—the term for a 3D model of the foot—meant to represent the common sneaker wearer. For the Futurenatural line, Adidas wanted a better fit.

Adidas scanned thousands of people’s feet, including those of professional athletes. Of course, the popular shoemaker already had prints of athletes who have promoted their own Adidas sneakers in the past. James Harden, basketball player for the Brooklyn Nets, is among those elite players. The Futurenatural line debuted with the player’s fifth signature basketball sneaker, the Harden Vol. 5.

The engineers pulled together all types of feet—large, small, narrow, wide—to best represent the foot. From that, they developed a new “last.”
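
The article doesn’t describe the math behind the new last, but the core idea of combining many aligned scans into one representative shape can be sketched simply. The `average_last` function below is hypothetical; it assumes each scan has already been rigidly aligned and shares the same vertex layout, which real pipelines would achieve with registration algorithms such as ICP.

```python
# Hypothetical sketch: derive a composite "last" by averaging aligned foot
# scans. Each scan is a list of (x, y, z) vertices in the same order.

def average_last(scans):
    """Average N aligned scans into one composite point set."""
    if not scans:
        raise ValueError("need at least one scan")
    n_verts = len(scans[0])
    if any(len(s) != n_verts for s in scans):
        raise ValueError("scans must share a vertex layout")
    composite = []
    for i in range(n_verts):
        xs, ys, zs = zip(*(scan[i] for scan in scans))
        composite.append((sum(xs) / len(scans),
                          sum(ys) / len(scans),
                          sum(zs) / len(scans)))
    return composite

# Two toy "scans" of a three-vertex foot outline (already aligned):
narrow = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 2.0, 0.0)]
wide = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (2.0, 2.0, 0.0)]
print(average_last([narrow, wide]))
```

A production version would average thousands of scans, likely weighted by population, rather than the two toy shapes shown here.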

Designers make their first foray into the Gravity Sketch virtual environment to fit the last with experimental sneaker concepts. Here is where they play with articulated lines in the 3D environment, rather than extrapolating view and fit from a 2D print, Stinchcomb says. They can rotate the view to see how the shoe would look, from the top, bottom, and sides.

At this first stop in their virtual environment, Stinchcomb and fellow designers work out new ideas for a sneaker’s footpad and play around with ways the upper might be molded and pressed. They sculpt arches and add padding to the sole in areas where the foot would benefit from reinforcement.

Collaboration is a key part of this design, with the designers talking back-and-forth in the virtual world as they gesture at parts and play around together with design, Stinchcomb says.

“We take a shoe and explode it and invite people into the space and spec out every single detail. We can blow it up to the size of a warehouse and they can swim around the shoe, doing a deep dive on every part,” he says.

“At such an early stage, we can discuss complex details within the form,” he adds. In fact, these early iterations hold enough information to be fleshed out even further, which takes place at the next stage, or station: surfacing.

This is where the skeleton comes together and where volumes are defined, Stinchcomb says. Here, designers wrap their shoe to simulate the material they have in mind for end use. At this step, they create a continuous, lifelike surface with the help of SmoothKit software to sharpen effects.

The team also uses Adobe Substance Painter to “get the feel of the material” and to shade the image so it looks “as realistic as possible,” says Marius Jung, senior designer.

Because the footwear industry makes heavy use of Adobe Photoshop and Illustrator, these new tools were a bit of a departure for the team, he says.

“In the past, we’d spend hours creating the right shadows and lighting, and now we’re able to speed that up and dive right into detailing like we’ve never been able to before,” he says.

When designers are satisfied with the shape and look of the shoe, they move to the next area within the virtual design space. At this juncture, they add details like laces and lace loops to their continuous surface. The team then renders the illustration with KeyShot software to give the image a photorealistic, lifelike quality.

At this stage, the team can share the image with other Adidas departments, mainly marketing and manufacturing. These teams offer their suggestions long before a final virtual prototype, much less a physical prototype, is created, Jung says. Their input is important, because the Futurenatural line is a step apart from the usual. Designers need to know, and need to know early: can the manufacturer make a mold for this shoe using the designated materials? Will buyers be delighted or dismayed by this form for a new integrated sole?

Members of those teams can be invited into the virtual world if they have access to an Oculus. If not, the images can be shared on a desktop, Jung says.

Adidas worked with one of its factories to develop a new production process for the new shoe. During design, representatives from that manufacturer weighed in with tooling ideas. They also offered feedback about how they might produce the welting and lace loops. Marketers made suggestions about brand placement and other features.

Mutual maker space
The Futurenatural design team had been working together almost a year in March 2020 when the COVID pandemic forced many companies to move employees to home offices. Some engineering and design businesses stuttered a bit as they found new ways to collaborate outside an office.

Even people regularly tied by collaboration software might have felt a hiccup as they accessed software on their home computers, in their home spaces. Meanwhile, Sanjuan and his Adidas teammates stepped right back into their familiar space: the virtual office and maker space within the virtual shoe.

“I’ve always been one to be in the workshop figuring things out with my hands and working with materials,” he says. “I found my work in 3D could replace those things. We work together in that world so closely.”

Scholz too emphasized the inventive atmosphere that prevailed within the digital footprint.

“The virtual space kept the creativity and the spirit alive during the pandemic,” he says. “It’s just a fun, intuitive and playful way to create serious products.”

And that playfulness showed with the debut of the Harden Vol. 5 in January 2021 and the ensuing Futurenatural products, which feature polka dots, splotches, and paint-like splatters in a number of patterns and colors, wavy soles, and an upper that melds seamlessly with the bottom of the shoe for an almost sock-like look.

In the future, the line is expected to include more materials and new designs. The shoes will, of course, be designed within the digital shoeprint using Gravity Sketch 3D design technologies.

“The virtual reality system definitely demonstrated its value,” Sanjuan says. “Now everyone wants to try it. Because the learning curve is so easy, it’s spreading like wildfire to put 3D in the hands of anyone who wants it.”

Those newcomers are welcome, he adds.

“Especially at a big, grand company like Adidas, it’s important to inject new processes into footwear and to look at things in a different way,” Sanjuan says.

Filed Under: Simulation Software, VR software

Varjo unveils reality cloud platform for capturing and sharing our reality for a true-to-life metaverse

June 28, 2021 By WTWH Editor Leave a Comment

Varjo, the leader in industrial-grade VR/XR hardware and software, today announced a pioneering new platform, Varjo Reality Cloud. The new platform will enable virtual teleportation for the first time by allowing anybody to 3D scan their surroundings using a Varjo XR-3 headset and transport another person to that same exact physical reality, completely bridging the real and the virtual in true-to-life visual fidelity. This real-time reality sharing will usher in a new era of universal collaboration and pave the way for a metaverse of the future, transforming the way people work, interact, and play.

“We believe that Varjo’s vision for the metaverse will elevate humanity during the next decade more than any other technology in the world,” said Timo Toikkanen, CEO of Varjo. “What we’re building with our vision for the Varjo Reality Cloud will release our physical reality from the laws of physics. The programmable world that once existed only behind our screens can now merge with our surrounding reality – forever changing the choreography of everyday life.”

For the past five years, Varjo has been building and perfecting the foundational technologies needed to bring its Varjo Reality Cloud platform to market, such as human-eye resolution, low-latency video pass-through, integrated eye tracking and the LiDAR capability of the company’s mixed reality headset. As the only company to have delivered these building-block technologies in market-ready products, Varjo is uniquely positioned to combine them with Varjo Reality Cloud to empower users to enjoy the scale and flexibility of virtual computing in the cloud without compromising performance or quality.

Using Varjo’s proprietary foveated transport algorithm, users will be able to stream the real-time human-eye resolution, wide-field-of-view 3D video feed in single megabytes per second to any device. This ability to share, collaborate in and edit one’s environment with other people makes human connection more real and efficient than ever before, eliminating the restrictions of time and place completely.
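
Varjo has not published the details of its foveated transport algorithm, but a back-of-envelope sketch shows why foveation shrinks the data rate so dramatically: only a small foveal region where the eye is looking needs full resolution, while the periphery can be heavily downsampled. The numbers below (frame size, fovea fraction, periphery scale) are illustrative assumptions, not Varjo specifications.

```python
# Illustrative estimate only (not Varjo's actual algorithm): stream a small
# foveal region at full resolution and downsample the periphery, then
# compare the per-frame pixel budget with a naive full-resolution stream.

def foveated_pixels(width, height, fovea_frac=0.05, periphery_scale=0.25):
    """Pixels per frame when `fovea_frac` of the frame stays full-res
    and the rest is downsampled by `periphery_scale` per axis."""
    total = width * height
    fovea = total * fovea_frac
    periphery = (total - fovea) * periphery_scale ** 2
    return fovea + periphery

full = 2880 * 2720  # an assumed per-eye frame size for illustration
fov = foveated_pixels(2880, 2720)
print(f"naive: {full:,} px/frame; foveated: {fov:,.0f} px/frame "
      f"({full / fov:.1f}x smaller)")
```

With these assumed parameters the foveated frame carries roughly a ninth of the pixels, before any video compression is applied on top.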

To further accelerate bringing the vision for Varjo Reality Cloud to life, Varjo today also announced the acquisition of Dimension10, a Norwegian software company that pioneers industrial 3D collaboration. Their virtual meeting suite is designed for architecture, engineering and construction teams and will become a critical component to making virtual collaboration possible within Varjo Reality Cloud.

Additionally, Varjo welcomed Lincoln Wallen to the company’s board of directors. Wallen currently serves as the CTO at Improbable, is a recognized scholar in computing and AI, and brings to Varjo his extensive knowledge of large scale cloud computing, and moving digital content production into the cloud. Previously, Wallen has worked as CTO of Dreamworks where he transitioned global movie production to the cloud, including the development of a cloud-native toolset for asset management, rendering, lighting, and animation.

Varjo
varjo.com

Filed Under: News, VR software Tagged With: varjo

A view on where AR/VR is headed, roundtable discussion from those who know

February 12, 2021 By Leslie Langnau Leave a Comment

Recently, Ron Fritz, CEO of Tech Soft 3D, hosted a roundtable discussion with five other industry executives to discuss the current state of augmented reality (AR) and virtual reality (VR). The core question at hand: whether AR/VR is finally poised for its breakthrough moment – and if so, what barriers might need to be removed to usher in this new era.

The participants included:

– Asif Rana, COO of Hexagon, a provider of sensor, software, and autonomous solutions

– Martin Herdina, CEO of Wikitude, an augmented reality technology company

– Susanna Holt, VP Forge Platform, Autodesk, a provider of 3D design and engineering software

– Thomas Schuler, CEO of Halocline, a developer of VR products for production planning and manufacturing

– Tony Fernandez, CEO of UEGroup, a user experience agency

A lightly edited and condensed version of the conversation and their unique perspectives follows.

Q: At various points over the past decade, many of us have believed that AR/VR was ready to really take off in the industrial setting – but it hasn’t happened yet. What are the barriers that are standing in the way of that widespread adoption, and what should the industry be focusing on?

Asif: One of the fundamental things that we tend to forget when we think about commercializing a technology is the user experience. I think one of the main hurdles for AR/VR in commercial use is that we don’t think about the full user journey or what the full end-to-end solution looks like.

Martin: For a while, there was such a focus on technical benchmarks that nobody really talked about what could be achieved with AR/VR. Even when people did start to talk about what could be achieved, they didn’t really look at the full picture and at how things could be scaled beyond a single isolated use case. As long as that underlying basis is missing, widespread adoption of AR/VR will be hampered.

Susanna: I think one thing that’s lacking around AR/VR is pre-processing of data and data preparation – from CAD design data, to mesh poly count reduction. That kind of stuff needs to be automated, robust, fast, and scalable. And at the moment, all of that still seems to require too much manual work to really enable this AR/VR takeoff that we’ve been anticipating for the past 20 years.
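
One common way to automate the poly-count reduction Holt describes is vertex clustering: snap vertices to a coarse grid, merge those that land in the same cell, and drop triangles that collapse. The minimal sketch below is illustrative only; production tools typically use subtler error metrics, such as quadric error decimation, to preserve shape.

```python
# Minimal sketch of one automated decimation strategy (vertex clustering).
# Vertices that fall into the same grid cell are merged, and faces that
# collapse to fewer than three distinct vertices are discarded.

def cluster_decimate(vertices, faces, cell=1.0):
    key_to_new = {}   # grid cell -> new vertex index
    old_to_new = []   # old vertex index -> new vertex index
    new_vertices = []
    for x, y, z in vertices:
        key = (round(x / cell), round(y / cell), round(z / cell))
        if key not in key_to_new:
            key_to_new[key] = len(new_vertices)
            new_vertices.append((key[0] * cell, key[1] * cell, key[2] * cell))
        old_to_new.append(key_to_new[key])
    new_faces = []
    for a, b, c in faces:
        a2, b2, c2 = old_to_new[a], old_to_new[b], old_to_new[c]
        if len({a2, b2, c2}) == 3:   # keep only non-degenerate triangles
            new_faces.append((a2, b2, c2))
    return new_vertices, new_faces

# Two nearly coincident vertices merge, collapsing two of the triangles:
verts = [(0, 0, 0), (0.1, 0, 0), (2, 0, 0), (0, 2, 0)]
tris = [(0, 1, 2), (0, 2, 3), (0, 1, 3)]
v2, f2 = cluster_decimate(verts, tris)
print(len(verts), "->", len(v2), "vertices;", len(tris), "->", len(f2), "faces")
```

The appeal of this family of techniques is exactly what Holt calls for: no manual work, predictable runtime, and easy scaling across thousands of CAD parts.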

Tony: I think the core issue is that AR/VR did not emerge from a human-centered point of view. It emerged from a technological exploration point of view. And what that has meant is that the human factors of this technology are terrible.

To take the case of VR: Who thought it was going to be a great idea to duct tape a TV to your head and blindfold you? Meanwhile, with AR, one of the problems that we continually run into is arm and body fatigue from having to hold up a device. Because AR/VR technology hasn’t centered around the reality of the human body, how it gets fatigued, and how people feel motivated to use their bodies, it will continue to have a difficult time breaking through to the mainstream, regardless of the value proposition it may offer.

Q: From what everyone’s saying, it seems that the user experience is one of the big barriers to mainstream adoption. What needs to be different for people to feel comfortable? How can companies remove this barrier?

Tony: I think mobile AR is a really difficult problem to solve. And again, part of the problem with most existing AR solutions is that they require people to use their bodies in unnatural ways. From a hardware perspective, we’re going to be much closer to solving that problem once we get to some sort of compact glasses. Of course, glasses come with their own problems around power and where to place the battery and so on. But I think that’s what AR’s waiting for, in terms of a hardware platform solution.

Asif: I wonder whether there are the same expectations on an enterprise level as at a consumer level for AR/VR. I say that because in the enterprise, you do see technology that’s not so comfortable to use – but it delivers such a high value that it’s used anyways. So, perhaps the AR/VR hardware is “good enough,” and it’s the content side that deserves more focus to deliver applications that can really make an impact and deliver value. Either way, I’d say that if the hardware companies focused on more business cases, that would be helpful to the enterprise sector.

Susanna: It’s true that the enterprise use case may put up with all sorts of inconveniences. But when I think of a use case for us at Autodesk, which might be an architect or structural engineer at a construction site or building site, inconvenience can quickly become a safety concern. AR provides a limited field of vision. In normal life, we don’t just look straight ahead – we’re constantly taking in things occurring on the periphery. Excluding that visual information in a potentially dangerous environment like a construction site does strike me as a risk factor. So, the hardware has to be natural to the way we conduct ourselves as humans in a particular environment.

Martin: I think the most important point that people have hit on is that things have to feel natural. When you wear a HoloLens, it’s cool, but it’s nothing that you would want to wear for 10 hours per day at your workspace. Another aspect that companies should address is the fact that so many AR use cases totally lack context. For example, why would you use AR to project a team roster on your desk when there are so many other user interfaces that make so much more sense for that objective? AR needs to really link reality to a reasonable set of content.

Q: Lots of big names – including Google, Apple, Facebook, and Microsoft, to name a few – are heavily investing in the belief that the barriers around AR/VR adoption are being resolved and that this an area that is ripe for explosion. All of your companies are, to varying degrees, investing in that belief as well. What makes you optimistic that AR/VR is getting close to a real breakthrough? What drives your confidence?

Thomas: It takes a long time to bring hardware technology from an early prototype to a usable product. You have to really keep at it for quite some time. What makes me optimistic is that the hardware vendors are still investing in it and pushing it forward – they’re not standing still.

At the same time, more and more content is now being produced that makes more sense. I think more people understand now that you need a different set of tools for AR or VR rather than taking the same old tools that you had before, but just manipulating them differently. So, while the progress might be slower than everyone expected, that progress is very much ongoing. That makes me optimistic that we are on an eventual path towards more widespread adoption for AR/VR.

Susanna: Well, let me turn this question the other way around. We’re hearing so much from our customers about how AR or VR is needed and how they’re expecting it to play a bigger role in their workflows. Some of that, of course, is a reflection of hype that they see in the media, but a significant proportion of it is a reflection of real need.

For example, while wearing a HoloLens headset might be uncomfortable today, it does allow you to make those important decisions much faster than having to look at something, take a photograph, go back to the office, think it through, discuss it, and so on. It will speed everything up. It’s about faster decisions, better decisions. There’s a real need in the market – so that bodes quite well for AR/VR, because a lot of technological advancement and evolution is driven by market need.

Tony: I would say AR/VR will break through if it can focus on its fundamental promise, which is to reveal information and perspectives in ways that would be difficult to do any other way. I’m not necessarily a believer that the way most companies have defined AR at this point is necessarily the path forward. For example, AR doesn’t necessarily always have to be visual in nature, right? It can be haptic in nature. It can be lots of other things. But visual is the primary road for now, and I think the need to visualize information that is otherwise difficult to do any other way or get access to any other way is going to drive the solution.

Martin: At my company, we perhaps have a unique perspective, because we have thousands of developers using our tools on a daily basis to create AR use cases, and we can see what those people are working on. The things they are doing today with AR are substantially different from what we saw two or three years ago. There are still people working on proof of concepts, but the number of people who are moving from POC to commercial grade installations – and the number of use cases we see that are no longer for two or three or five users, but 10,000 to 20,000 users – has rapidly increased in the past year.

Also, from a finance perspective, AR is no longer tapping into the budgets of the innovation units – it’s tapping into the budgets of the actual business units. That’s the ultimate sign that technologies like AR/VR are starting to take hold in the enterprise space.

Asif: There are at least three reasons why I’m feeling very positive about AR/VR. The first is the acceleration of digitalization that has taken place as a result of the COVID-19 pandemic. Many, many systems are getting digitally transformed, and digital journeys that might have taken years to complete are now on the fast track. So, the ground is really set for AR to make a move.

The second reason is that digital process management has really evolved. The journey really starts with connectivity first, then it goes to the integration, then it goes to the digital workflows. Once you have the workflow, to augment the workflow with AR is very straightforward.

The third reason is the advent and proliferation of smartphones and tablets that are loaded with the sensors and features that are required for AR/VR. These devices are now at everyone’s fingertips, ready to be used for various advanced workflows. So, really, I think the time is very, very good right now for AR/VR.


Filed Under: Autodesk, Hexagon software, News, VR software Tagged With: techsoft3d

Integrating desktop applications inside immersive VR/XR environments

December 23, 2019 By Leslie Langnau Leave a Comment

Varjo (Shadow in Finnish) Technologies, a leader in enterprise-grade VR/XR headsets, announced it has developed a 2D/3D immersive user interface. Code-named ‘Varjo Workspace,’ the company’s Dimensional Interface allows professionals to easily use Microsoft Windows applications and 3D software tools within a human eye-resolution VR or AR environment. With Varjo Workspace, users can seamlessly switch between real, virtual and mixed reality modes and modify their creations while experiencing them in 3D.

Varjo’s Dimensional Interface gives access to the Microsoft Windows desktop at any size and with extreme resolution. All of this is made possible through Varjo’s human eye-resolution capabilities, as no other headset can display readable text on a virtual screen or mix the virtual and real worlds seamlessly together. With Varjo Workspace, it’s now possible to have infinitely adjustable multiple monitors and to work simultaneously within 2D and 3D worlds.

The user interface can be experienced with the company’s XR-1 Developer Edition headset, bridging the current 2D UI with the photorealistic 3D world from Varjo’s video pass-through-based mixed reality. Professionals in design, engineering and training can work within an immersive VR/XR environment while simultaneously using existing desktop applications. Through Varjo’s Dimensional Interface, professional users can experience and modify their 3D models without ever taking off their headset.

For example, Varjo Workspace can be used in the automotive industry to modify a 3D car model using existing CAD and visualization tools like Autodesk VRED, Unity or Unreal while simultaneously observing the model in mixed or virtual reality.

Varjo Workspace is shipping to customers and partners as part of the software delivered with the XR-1 Developer Edition. The XR-1 Developer Edition headset is available for purchase immediately at $9,995 (USD and Euros), and is sold together with Varjo’s Software and Support service at $1,995 (USD and Euros). Varjo plans to implement customer and partner feedback in 2020 to further deepen its integration with professional design, engineering and simulation tools.

Varjo
varjo.com

Filed Under: VR software Tagged With: varjo

Meshmatic optimizes CAD files for real-time visualization

December 4, 2019 By Leslie Langnau Leave a Comment

Meshmatic is a 3D optimization software that helps engineers prepare design files for real-time visualization faster. A challenge with many large and complex engineering files is having to manually clean and optimize them for 3D rendering or AR/VR development. Such tasks are often repetitive and tedious, as well as prone to human error.

Meshmatic is standalone software that automates tedious optimization tasks and helps companies save time when cleaning up heavy and complex design files for simulation and AR/VR applications. It works with applications such as Keyshot and V-ray or game engines like Unreal Engine or Unity.

The software’s proprietary algorithms mathematically calculate the rotation, position, and scale of each of the 3D objects in a scene and group them for duplicate instantiation or further modification. This method of calculation is useful for CAD conversions, such as to FBX or OBJ, with reset transformations.
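
Meshmatic’s algorithms are proprietary, but the general duplicate-detection idea can be sketched: normalize each object’s geometry by removing its translation and scale, then group objects whose canonical shapes match, so duplicates can share one mesh plus per-instance transforms. All names and tolerances below are assumptions for illustration (this simplified version also ignores rotation).

```python
# Hedged sketch of duplicate detection for instancing (not Meshmatic's
# actual algorithm): recenter each object's vertices on its centroid,
# divide out its scale, and hash the canonical shape. Objects with the
# same hash are duplicates and can be instanced.

def canonical_key(vertices, digits=6):
    # Remove translation: recenter on the centroid.
    n = len(vertices)
    cx = sum(v[0] for v in vertices) / n
    cy = sum(v[1] for v in vertices) / n
    cz = sum(v[2] for v in vertices) / n
    centered = [(x - cx, y - cy, z - cz) for x, y, z in vertices]
    # Remove scale: divide by the largest coordinate magnitude.
    scale = max(max(abs(c) for c in v) for v in centered) or 1.0
    return tuple(tuple(round(c / scale, digits) for c in v) for v in centered)

def group_duplicates(objects):
    groups = {}
    for name, verts in objects.items():
        groups.setdefault(canonical_key(verts), []).append(name)
    return [sorted(g) for g in groups.values()]

scene = {
    "bolt_a": [(0, 0, 0), (1, 0, 0), (0, 1, 0)],
    "bolt_b": [(5, 5, 0), (7, 5, 0), (5, 7, 0)],   # translated, 2x scaled copy
    "plate":  [(0, 0, 0), (4, 0, 0), (0, 1, 0)],
}
print(group_duplicates(scene))
```

Grouping the two bolts lets a renderer upload their mesh once and draw it with two transforms, which is the payoff of duplicate instantiation in real-time engines.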

Meshmatic reads 3D data from common file formats such as FBX, OBJ, STL, Sketchup, STEP and more, and processes them as polygonal mesh without harming shape integrity. Multiple data clean-up tools help users efficiently reduce file complexity without compromising the quality of the visualization or accuracy of the data.

In addition to its mesh, hierarchy clean-up and duplicate optimization tools, Meshmatic is equipped with data analysis tools to identify bottlenecks and errors in a file, which can be resolved with suggestive actions. The software also provides updates on file parameters such as file size, vertex count, face count, and more to help users track performance improvement during the clean-up and optimization process.
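
As a toy version of that kind of stats reporting, the sketch below counts vertices and faces in Wavefront OBJ text, a plain-text format where lines beginning with `v ` define vertices and `f ` define faces. It is a simplified illustration, not Meshmatic functionality.

```python
# Minimal sketch of file-stats reporting: count vertices and faces in
# Wavefront OBJ text. Running it before and after a clean-up pass shows
# how much the optimization reduced the file.

def obj_stats(obj_text):
    verts = faces = 0
    for line in obj_text.splitlines():
        if line.startswith("v "):
            verts += 1
        elif line.startswith("f "):
            faces += 1
    return {"vertices": verts, "faces": faces}

quad = """\
v 0 0 0
v 1 0 0
v 1 1 0
v 0 1 0
f 1 2 3
f 1 3 4
"""
print(obj_stats(quad))  # {'vertices': 4, 'faces': 2}
```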

VRSquare
meshmatic3d.com

Filed Under: VR software Tagged With: vrsquared

3D CAD World - Copyright © 2022 · WTWH Media LLC and its licensors. All rights reserved.
The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media.
