Four Directions for the Digital Transformation of Architecture
Industry 4.0
The necessity of digital transformation
Toward an opportunity for a leap forward
A change in “approach” beyond “tools”
The construction industry has long been considered one of the sectors slowest to embrace digital transformation, owing to its field-centered, labor-intensive nature and fragmented supply chain. Now, however, the powerful wave of the Fourth Industrial Revolution is knocking forcefully at the door of the once-solid construction and architectural design fields. The shortage of skilled workers caused by population decline, the carbon-neutrality demands driven by rapid climate change, and increasingly complex building performance requirements have reached a point that conventional analog methods can no longer resolve.
Moreover, the advent of AI has brought the moment for solving these problems forward. Artificial intelligence and big data make it possible to control numerous variables efficiently and to process repetitive tasks quickly and accurately. Digital transformation has opened a tremendous “window of opportunity” for higher productivity and new added value. The time has come for the architectural design field to create an inflection point and leap beyond the limitations of the era. Junglim Architecture is now undergoing digital transformation (DT), questioning the familiar methods and processes it has relied on thus far. Beyond merely changing tools, we are preparing for a new era of architecture through data-driven decision-making and process innovation. We propose four directions for the digital transformation of architecture, and from these four directions we hope to reflect and advance together toward better architecture.
Written by & Courtesy of Architect Sungwoo Ahn (Leader of the New Tech SU, Junglim Architecture)
Edited by the Brand Team of Junglim Architecture

01 BIM
Beyond simple 3D modeling to an “integrated information platform”
The digitization of design work revolves around building information modeling (BIM), which is not merely a tool that converts 2D drawings into 3D forms for visual presentation, but one that delivers all information required throughout the architectural process in the best possible way. In other words, BIM is defined as a platform that encompasses all information generated over the building’s life cycle, from planning and design to construction and facility management (FM).
The systematic accumulation and utilization of BIM data will elevate the standard of design quality and serve as a foundation of trust by providing transparent and accurate information in practice and decision-making processes.
- Streamlined collaboration
Clashes between disciplines such as architecture, structure, mechanical, and electrical are detected in advance, dramatically reducing design errors.
- Data continuity
An integrated pipeline is built so that data from the design phase flows without interruption into construction estimating and schedule management, as the sketch below illustrates.
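As a minimal illustration of such a pipeline, the sketch below reads a BIM model exported to IFC and totals wall quantities that could feed an estimating workflow. It assumes the open-source ifcopenshell library; the file name and the Qto_WallBaseQuantities quantity set follow common IFC conventions but may differ by authoring tool.

```python
# Minimal sketch: pulling wall quantities from an IFC export for estimating.
# Assumes the open-source `ifcopenshell` package; "project.ifc" is a placeholder.
import ifcopenshell
import ifcopenshell.util.element as element_util

model = ifcopenshell.open("project.ifc")          # hypothetical file name
walls = model.by_type("IfcWall")

total_net_area = 0.0
for wall in walls:
    qsets = element_util.get_psets(wall)          # property and quantity sets
    qto = qsets.get("Qto_WallBaseQuantities", {}) # standard IFC wall quantities
    total_net_area += qto.get("NetSideArea", 0.0)

print(f"{len(walls)} walls, total net side area: {total_net_area:.1f} m2")
```

In practice the same model data would flow onward to cost databases and schedule tools rather than a print statement, but the principle is the same: quantities are read from the model, not re-measured from drawings.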
02 Eco-friendly Design
From intuition-based to “data-driven design”
Sustainable architecture is no longer a choice but a necessity. While past eco-friendly design relied on designers’ intuition and experience, eco-friendly design in the DT era is grounded in clear “data” and “simulation”. Advanced environmental simulation techniques need to be actively introduced from the initial design phase of projects.
This approach not only provides clients with objective credibility, but also implements Junglim’s philosophy of fulfilling social responsibility through concrete technology.
- Quantitative analysis
Site airflow, solar radiation, light reflection, and energy performance are analyzed numerically to derive the optimal layout and form (a toy calculation of this kind follows this list).
- Evidence-based design
To the question “Why should the building take this form?” we provide rational answers not only through aesthetic value but also through quantified carbon emission reductions and energy efficiency.
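As a toy example of this kind of quantitative reasoning, the sketch below compares peak solar heat gain through glazing on two facade orientations using the simple relation Q = SHGC × A × I. The SHGC, glazed area, and irradiance values are illustrative assumptions, not project data; real projects would rely on hourly climate files and validated simulation tools.

```python
# Toy comparison of peak solar heat gain through glazing for two orientations.
# Q = SHGC * A * I : solar heat gain coefficient x glazed area x irradiance.
# All figures are illustrative assumptions, not measured project data.
SHGC = 0.4                   # assumed solar heat gain coefficient of the glazing
AREA_M2 = 120.0              # assumed glazed area per facade, m^2

irradiance_wm2 = {"south": 600.0, "west": 450.0}   # assumed peak values

for orientation, i in irradiance_wm2.items():
    gain_kw = SHGC * AREA_M2 * i / 1000.0
    print(f"{orientation:>5} facade: about {gain_kw:.1f} kW peak solar gain")
```

Even a rough figure like this turns a stylistic debate about orientation into a comparison of numbers that can be weighed against daylighting and view requirements.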
03 Computational Design
The power of algorithms to control complexity
Contemporary buildings are becoming increasingly complex and demand ever more diverse capabilities from architects. To meet these demands while securing constructability and cost-effectiveness, adopting computational design is essential, and algorithm-based parametric design must be actively utilized. It is a core process that frees designers from simple, repetitive modification work so they can focus fully on creative ideation and on improving design quality.
- Real-time feedback and flexible modification
Adjusting parameters enables the real-time control of the entire model’s geometry. This provides powerful flexibility to quickly review numerous design alternatives and immediately reflect client requirements.
- Consistency and data-driven efficiency
Geometry for curtain wall panels or structural members is generated and optimized automatically, producing precise 3D data that minimizes construction errors, as sketched below.
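The sketch below is a minimal, hypothetical illustration of the parametric idea: a handful of input parameters drive the subdivision of a facade into panels, so changing one number regenerates every panel consistently. In practice this logic usually lives in tools such as Grasshopper or Dynamo; plain Python is used here only to show the principle.

```python
# Minimal parametric sketch: facade panelization driven by a few parameters.
# Changing the facade size or the panel counts regenerates all panels at once.
from dataclasses import dataclass

@dataclass
class Panel:
    x: float  # left edge, m
    y: float  # bottom edge, m
    w: float  # panel width, m
    h: float  # panel height, m

def panelize(facade_w: float, facade_h: float, cols: int, rows: int) -> list[Panel]:
    """Subdivide a rectangular facade into a cols x rows grid of panels."""
    pw, ph = facade_w / cols, facade_h / rows
    return [Panel(c * pw, r * ph, pw, ph) for r in range(rows) for c in range(cols)]

panels = panelize(facade_w=36.0, facade_h=18.0, cols=12, rows=6)
print(f"{len(panels)} panels, each {panels[0].w:.1f} x {panels[0].h:.1f} m")
```

Because every panel is derived from the same parameters, a late change to the facade dimensions propagates to all panel data automatically instead of requiring manual rework.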
04 Design Automation and AI
A “Smart Partner” that extends human creativity
Simple, repetitive, and rule-based design tasks are being dramatically improved through automation tools and artificial intelligence. AI is utilized as an auxiliary tool that extends designers’ capabilities. The introduction of AI shortens the design period and also serves as an opportunity to derive innovative design solutions that could not be discovered through familiar perspectives.
- Generative Design
AI-powered image generation assists in initial conception and architectural visualization proposals. Furthermore, by learning site conditions, regulations, floor area ratio, and other constraints, it can instantly propose design alternatives (massing studies, parking layouts, area schedules, etc.); a simplified sketch of such constraint-driven generation follows this list.
- Optimal decision-making
Designers perform the role of sophisticated decision-makers who select and develop the optimal solution from numerous options proposed by AI.
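Purely as an illustration of constraint-driven option generation, and not of any specific commercial tool, the sketch below enumerates simple tower massings on a site and keeps only those that respect an assumed floor area ratio and height limit. All site figures are hypothetical placeholders.

```python
# Illustrative massing-option generator under simple zoning constraints.
# Site area, FAR limit, height limit, and floor height are hypothetical.
SITE_AREA_M2 = 2_000.0
MAX_FAR = 8.0             # assumed maximum floor area ratio
MAX_HEIGHT_M = 120.0      # assumed height limit
FLOOR_HEIGHT_M = 4.2      # assumed floor-to-floor height

options = []
for coverage in (0.3, 0.4, 0.5, 0.6):             # building coverage ratios to test
    footprint = SITE_AREA_M2 * coverage
    for floors in range(5, 40):
        height = floors * FLOOR_HEIGHT_M
        far = footprint * floors / SITE_AREA_M2    # gross floor area / site area
        if far <= MAX_FAR and height <= MAX_HEIGHT_M:
            options.append((coverage, floors, round(far, 2), height))

for coverage, floors, far, height in options[:3]:
    print(f"coverage {coverage:.0%}, {floors} floors, FAR {far}, height {height:.1f} m")
print(f"{len(options)} feasible alternatives generated")
```

The designer's role, as noted above, is to select and develop the best of these alternatives rather than to accept any of them blindly.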
The essence of change
Lies in the “process,” not the “tool”
The ultimate goal of digital transformation pursued by Junglim Architecture is not simply using the latest software or flaunting flashy technology. The true essence of change lies not in “which tools to use” but in “how we work” — in other words, fundamental innovation of the work process. Digital tools can break down team barriers and create a communication culture where data flows transparently among all stakeholders. If a culture becomes established where thinking is grounded in data and collaboration occurs organically in cloud and digital environments, we can move toward ideal digital transformation. Through technology-based architectural design, Junglim Architecture seeks to create “healthy spatial environments,” supporting “a world of living together.”
Intro
a point when AI development accelerates and a super-intelligence emerges that cannot be surpassed even by all human intelligence combined
– “Technological Singularity”, Wikipedia
The world is paying close attention to the development of artificial intelligence. Not only private companies but also public institutions — across all fields, from national governments to academia and industry — are concentrating their policies and projects on this single technology. Mathematician Vernor Vinge introduced the concept of the “technological singularity” in his 1993 essay, predicting that the rapid advancement of AI would one day surpass human intelligence and lead to an irreversible point of transformation. Indeed, the current pace of development suggests that such a singularity may be approaching, just as he foresaw.
Junglim Architecture has created healthy architecture and spatial environments in step with the changing times. As technology and everyday life have changed, new forms of architecture have emerged while familiar ones have continued to evolve. For nearly 60 years since its founding, Junglim has carried out numerous projects and accumulated a wealth of experience in responding to change, and most recently it has felt the growing influence of AI technology. AI has opened a new era of “data center” architecture. From small edge data center sites to hyperscale data centers optimized for AI, their functions and forms have become increasingly complex and diverse, and demand for the development and construction of data centers is rising accordingly. In this age when everything is being transformed into data, we must study data center architecture as a crucial foundation for protecting personal information, safeguarding national security, and securing data sovereignty.
Written by & Courtesy of Architect Jeongtaek Ahn (Leader of the Big-Tech BU, Junglim Architecture)
Edited by the Brand Team of Junglim Architecture

The core functions of AI data centers: training and inference
The global IT company IBM explains the AI data center as follows: “An AI data center is a facility that houses the specific IT infrastructure needed to train, deploy and deliver AI applications and services.”
What core functions do end users require of an AI data center? AI data centers must provide training and inference for AI models; this is the biggest difference from general cloud data centers.
- Training: The autonomous vehicle and robotics industries are representative examples. Companies such as Tesla, Waymo, Uber, and Hyundai Motor are continuously improving functions such as image recognition, object detection, path planning, and control through AI model training to enhance the performance of their autonomous driving systems.
- Inference: In the medical industry, AI inference is used to diagnose diseases, recommend treatment methods, and predict patient conditions through the analysis of medical images (such as X-rays, CT scans, and MRIs), genetic data, and patient records. Representative examples include IBM Watson Oncology and Google DeepMind Health.
The necessity of building AI data centers to ensure data sovereignty
Currently, Korea has a shortage of AI data centers capable of supporting AI technologies, and many leading domestic AI-based industries depend on overseas data centers and platforms. This situation raises issues of data sovereignty,1 and the government is concerned about a potential data monopoly or oligopoly held by a small number of global corporations. Such a concentration of power could give rise to a phenomenon like Orwell’s Big Brother.2 Establishing domestic AI data centers has therefore become an urgent task for ensuring data sovereignty and strengthening the competitiveness of the national AI industry.
Considerations in architecture and engineering to build AI data centers
“Training” and “inference” require highly intensive computational processing, with training demanding even more advanced technologies. From an architectural and engineering perspective, designing an AI data center involves careful consideration of high power consumption, next-generation cooling systems, high-speed networks, and heavy equipment loads. Clusters are composed of GPU servers instead of CPU-based servers, and parallel computing3 is essential for performing large-scale computational processing simultaneously. Accordingly, the design must focus on providing computer rooms (white spaces) suitable for GPU server clusters and systems capable of efficiently cooling high-heat–generating equipment.


The number of GPU servers as a measure of capability
The country with the largest number of paid ChatGPT subscribers worldwide is the United States, with South Korea coming in second. This is remarkable considering that South Korea is not a highly populous country, and it indicates that AI services such as ChatGPT and Gemini are becoming inseparable from our daily lives. A U.S. talk-show host’s joke that “writing emails is nearly impossible without ChatGPT” reflects this reality.
It is estimated that training OpenAI’s ChatGPT-4 required approximately 8,000–12,000 NVIDIA H100 GPUs. The South Korean government has also announced a goal to secure over 10,000 high-performance GPUs by the end of 2025, accelerating the development of AI infrastructure. Currently, the total number of GPU servers in the public and private sectors is in the thousands, but this is expected to expand rapidly into the tens of thousands. This suggests that domestic data centers are currently insufficient to provide services like ChatGPT purely with local technology and infrastructure.
What exactly is a GPU? Until a few years ago, CPU performance was the primary consideration when purchasing a computer, but in the AI era GPUs have become increasingly important. Parallel computing, which is essential for AI training, is handled far more efficiently by GPUs than by CPUs. NVIDIA is the leading player in the GPU market, and most AI data centers focus on acquiring the latest GPUs. The number of GPU servers held by a data center therefore serves as a key measure of its AI capabilities.

The rise of liquid cooling technology
When equipped with the aforementioned NVIDIA H100 GPUs required to operate ChatGPT, the IT power of a server rack reaches about 40 kW, four to five times that of a typical cloud data center rack. Such high power density generates immense heat. If this heat is not adequately dissipated and temperatures exceed allowable limits, the result can be reduced performance, errors, and, in severe cases, forced shutdowns. Liquid cooling technology is therefore essential for efficient and stable data center operation.
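To give a rough sense of why air cooling struggles at this density, the back-of-the-envelope sketch below estimates the airflow needed to carry 40 kW away from a single rack at an assumed 12 K supply-to-return temperature rise. The temperature rise and air properties are textbook assumptions, not measured figures.

```python
# Back-of-the-envelope airflow needed to remove 40 kW of rack heat with air.
# Q = rho * V_dot * c_p * dT  ->  V_dot = Q / (rho * c_p * dT)
# The temperature rise and air properties are assumed textbook values.
Q_W = 40_000.0    # rack IT load, W
RHO = 1.2         # air density, kg/m^3 (roughly, at sea level)
CP = 1005.0       # specific heat of air, J/(kg*K)
DELTA_T = 12.0    # assumed supply-to-return temperature rise, K

flow_m3_s = Q_W / (RHO * CP * DELTA_T)
print(f"about {flow_m3_s:.1f} m^3/s (~{flow_m3_s * 3600:,.0f} m^3/h) of air per rack")
```

Moving nearly three cubic meters of air per second through every rack quickly becomes impractical in terms of fan power and airflow design, which is why the liquid cooling approaches discussed below become attractive at these densities.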
The most widely discussed liquid cooling systems currently fall into three types: rear-door heat exchangers, direct-to-chip liquid cooling, and immersion cooling. When designing an AI data center, the number of GPU servers and the per-rack power must be determined first, and the mechanical, electrical, plumbing, and fire protection (MEPF) services designed around them. In particular, when designing a liquid cooling system, it is essential to compare the candidate liquid cooling technologies.
Although liquid cooling technology is actively discussed in domestic AI data center development, most implementations remain at an experimental scale, and very few cases exist where liquid cooling has been applied at a commercial scale. As the domestic AI data center market grows, the adoption of liquid cooling technology is expected to expand further.

The future of data centers
Concentration and distribution, and healthy technology
The conclusion of a data center seminar often turns toward visions of the distant future. When contemplating the far future of data centers, I imagine two extreme scenarios: one is a mega-scale model where all data is centralized, and the other is a fully decentralized model in which data converges on individual devices. In the latter case, we can imagine an innovative structure in which distributed personal devices dynamically cluster and operate like a collective intelligence to process data as needed.
“Almost all software to date has been developed by corporations and states. Such software is inherently centralized. (…) The more data accumulates at the center, the more power the center holds. (…) The collected data strengthens artificial intelligence, and the strengthened AI ascends to become a predator within the ecosystem. (…) The reinforcement of the center and the weakening of the periphery can be observed in industries, economies, and ways of life where AI has been introduced. AI brings immense power to the corporations and states that own it, while weakening those that do not, including other corporations, states, and individuals.”
– Ju Young-min, Virtual is Real, Across, 2019
There are ceaseless discussions on technological neutrality, but when human intent intervenes, technology can exhibit bias. Could it be that we are entering an era in which architects’ philosophical reflections on the ethical responsibility of technology become even more crucial? Junglim Architecture pursues “healthy spatial environments.” In this context, what role can architects play in fostering the development of “healthy technology”? Is it even possible for “healthy technology” to exist?
I believe the architect’s role extends beyond designing physical spaces; it also encompasses considering the impacts of technology on humans and society. From an architectural perspective, exploring the ethical and social implications of technology, and seeking ways to contribute to the creation of a healthy technological ecosystem, will be a key challenge for the architecture of the future.

Data centers designed by Junglim Architecture
- Naver Data Center Gak Sejong (2023)
- SK Broadband IDC, Gasan (2021)
- ST Telemedia GDC, Gasan (to be completed in 2025)
- KAAM Square IDC, Ansan (under design)
- Jikjilabs AI Data Center (under design)
1. Data sovereignty refers to the concept that data is subject to the laws and governance structures of the country in which it is collected. (Wikipedia)
2. Big Brother is the absolute ruler appearing in George Orwell’s dystopian novel 1984, and in modern society the term is used as a symbol of privacy invasion or excessive control.
3. Parallel computing is a computational method that performs many calculations simultaneously. It is primarily used to divide large and complex problems into smaller parts and solve them in parallel. (Wikipedia)