AI in design – The interplay between algorithms and human hands

Christian Waldhütter • 10.12.2025
  • Visualization
  • Research
  • Automotive
  • Architecture
  • Industrial

Transformation with control?

Artificial intelligence is no longer just an experiment for developers; it has arrived in the world of design and has the potential to fundamentally change the process from the first sketch to the final rendering. Yet whether for product designers, architects, or visual artists, the integration of AI is causing controversy. The ability to generate complex visualizations or countless iterations within seconds conflicts directly with questions of authorship, control, and the definition of one's own creative identity.

So how can this powerful tool be meaningfully integrated into the design process without surrendering control or one's own signature style?

AI as an efficiency tool

By exploring a wide range of stylistic variations and conceptual ideas, AI enables much faster and broader content generation than traditional brainstorming can achieve in the same amount of time. Routine processes can also be made more efficient and automated in the early stages of the design process. AI-supported rendering pipelines deliver precise and visually compelling results from sketches or 3D data alone, replacing time-consuming visualizations. With the time saved, designers can shift their focus to conceptualizing the customer approach, developing the visual vision, and ensuring creative authenticity.

However, this transformation of design through AI does not devalue craftsmanship; rather, it rebalances the work steps. Human decision-making, and above all the intention behind it, will and must remain the highest authority.

Controversy and loss of control

Even though the results of AI-supported tools are impressive, they raise practical and ethical questions. The training of AI models is often based on gigantic data pools, and this is precisely where the danger arises that stylistic boundaries become blurred and creative authenticity is lost in the flood of generated output.

Furthermore, in most cases, the internal decision-making processes of these algorithms are not transparent. For designers, however, it is essential to understand not only what has been created, but also how it was created in order to pursue their own vision.

When content is generated without specific intent or human intervention, a loss of the human touch is the inevitable consequence. Keeping this in mind throughout the digital workflow is essential to maintaining maximum control over AI output: the more powerful the tool, the more precise the control instrument must be.

Craftsmanship as the key

Despite the fascination and efficiency of AI-generated images, manual drawing remains an indispensable practice in the digitally dominated design process. Guiding a pen, whether on paper or on a digital drawing tablet, trains eye and hand in a way that pure prompt engineering never can. Only years of drawing practice internalize an intuitive understanding of form, composition, light, and shadow. This experience equips designers to modify AI outputs and correct their deviations. Where algorithms merely deliver the most probable result, trained drawing skills bring specific intention and focus to the vision.

Skills such as drawing convincing textures, accurately positioning reflections, or correcting faulty perspectives are necessary corrective measures for generated output, and anything but superfluous. Those who have mastered the craft can exploit the tool's potential in a targeted way, manipulate it, and preserve their own creative identity in the final work.

Wacom – precision and control

We use the Wacom Cintiq 16 for our work with generative AI. The graphics tablet bridges the gap between the generative power of algorithms and the emotional, purposeful control of the designer.

Instead of relying on text prompts, precise hand-drawn sketches give the AI clear direction. During digital drawing, pressure sensitivity and intuitive use are crucial factors in seamlessly conveying the designer's ideas to the generative power of algorithms.
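How pressure sensitivity translates into line quality can be illustrated with a small sketch: mapping a normalized pen-pressure reading to a stroke width through a gamma curve, as many drawing applications do. The function name and all default values here are illustrative assumptions, not Wacom's actual driver API.

```python
def stroke_width(pressure: float, min_w: float = 0.5,
                 max_w: float = 12.0, gamma: float = 1.8) -> float:
    """Map a normalized pen pressure (0.0-1.0) to a brush width in pixels.

    A gamma > 1 keeps light touches thin, so fine detail stays
    controllable; the defaults are illustrative, not vendor values.
    """
    pressure = max(0.0, min(1.0, pressure))   # clamp sensor noise
    return min_w + (pressure ** gamma) * (max_w - min_w)

# A light touch yields a hairline, full pressure the maximum width:
print(stroke_width(0.0))   # 0.5
print(stroke_width(1.0))   # 12.0
```

The non-linear curve is the design choice that matters: it is what lets a trained hand, rather than a text prompt, carry intention into the sketch the AI then conditions on.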

Tools such as inpainting or outpainting in AI systems can be used for targeted corrections of generated images. Precise masking and brush control are important for replacing texture details or faulty areas accurately and quickly. Here again, traditional craftsmanship is required to integrate drawings correctly.
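Such a correction typically starts with a mask painted over the faulty region. The following is a minimal NumPy sketch, under the assumption that the tool exposes the brush layer as a grayscale array, of turning painted strokes into the binary mask an inpainting model consumes; the actual masking logic inside any given AI system will differ.

```python
import numpy as np

def brush_to_mask(brush_layer: np.ndarray, threshold: float = 0.1,
                  dilate_px: int = 2) -> np.ndarray:
    """Convert a grayscale brush layer (values 0.0-1.0) into a binary
    inpainting mask.

    Pixels painted above `threshold` are marked for regeneration; a small
    dilation pads the mask so the model can blend the edges of the repair.
    """
    mask = brush_layer > threshold
    # Naive 4-neighborhood dilation: OR the mask with shifted copies.
    for _ in range(dilate_px):
        padded = np.pad(mask, 1)
        mask = (padded[1:-1, 1:-1] | padded[:-2, 1:-1] | padded[2:, 1:-1]
                | padded[1:-1, :-2] | padded[1:-1, 2:])
    return mask.astype(np.uint8) * 255   # 255 = "replace this pixel"

# A single painted stroke grows into a slightly padded repair region:
layer = np.zeros((8, 8))
layer[3, 3:5] = 0.9                      # the designer's brush stroke
mask = brush_to_mask(layer)
print(mask[3, 3], mask[0, 0])            # 255 0
```

The point of the sketch is the workflow, not the code: the precision of the painted stroke directly bounds the region the model is allowed to regenerate, which is why brush control matters here as much as it does on paper.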

Color corrections, light adjustments, highlights, and intentional brush strokes are essential to bring the AI output a step closer to the designer's creative identity. A graphics tablet remains the designer's control device for these crucial nuances and for the final refinement of generated content.
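Such a finishing pass can be as simple as an exposure and gamma adjustment over the generated image. This is a generic sketch with illustrative values, not tied to any specific tool's grading controls.

```python
import numpy as np

def finish_pass(img: np.ndarray, exposure: float = 1.1,
                gamma: float = 0.9) -> np.ndarray:
    """Apply a simple exposure boost and gamma lift to an image in [0, 1].

    Exposure scales brightness linearly; a gamma below 1 lifts the
    midtones -- the kind of final nudge a designer applies by hand.
    The default values are illustrative assumptions.
    """
    return np.clip(img * exposure, 0.0, 1.0) ** gamma

# Midtones are lifted, while black and white stay anchored:
mid = np.full((2, 2), 0.5)
print(finish_pass(mid)[0, 0])
```

Even a pass this small is a deliberate, human decision about the final image, which is exactly the kind of intervention the paragraph above argues should not be left to the generator.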

Conclusion – The pen as a tool for intentionality

Artificial intelligence is a tool and a catalyst in the design process, but it does not and will not replace creativity. Effective use requires not only technical understanding but, above all, control and craftsmanship. Experience and drawing skills remain indispensable for understanding AI outputs and intervening in the process at the right points.

In the end, it will still be the guided pen that determines the visionary accuracy and distinctive signature of the designer in the design, and not the randomness of the algorithm.

Christian Waldhütter
With a background in product and transportation design, Christian is fascinated by design processes and how AI can expand the boundaries of creative work, exploring what's possible and what's not (yet).
