WeatherCity: Urban Scene Reconstruction with Controllable Multi-Weather Transformation

WeatherCity is a novel framework that enables flexible, high-fidelity, and temporally consistent 4D urban scene reconstruction with controllable multi-weather transformations by combining text-guided image editing, a shared-feature weather Gaussian representation, and a physics-driven dynamic model.

Wenhua Wu, Huai Guan, Zhe Liu, Hesheng Wang

Published 2026-02-26

Imagine you are a director filming a movie about a busy city street. You have a perfect camera recording the scene on a sunny day. But now, you need to show what that same street looks like during a heavy downpour, a blizzard, or a thick fog.

In the past, trying to do this was like trying to paint a new sky onto a photograph. If you just used a photo editor, the rain might look fake, the cars might warp into weird shapes, or the puddles might appear in the wrong places. It was like trying to change the weather in a painting by smudging the paint—you lose the details of the original picture.

WeatherCity is a new "magic tool" that solves this problem. Instead of just editing a flat picture, it builds a 3D digital twin of the city first, and then lets you change the weather inside that 3D world.

Here is how it works, broken down into simple concepts:

1. The "Lego City" vs. The "Painting"

Most old methods treated the video like a 2D painting. If you wanted to add snow, you just painted white over the cars. The problem? The snow didn't stick to the car's shape; it just sat on top, and if the car moved, the snow looked like it was sliding off a flat sheet of glass.

WeatherCity builds the city out of millions of tiny, invisible "digital Lego bricks" (called Gaussians).

  • The Structure: These bricks know exactly where the road, the buildings, and the cars are in 3D space. They hold the "skeleton" of the city.
  • The Skin: The tool separates the shape of the city from the look of the weather. Think of it like a mannequin (the city structure) and a wardrobe (the weather). You can swap the wardrobe from "Sunny" to "Snowy" without ever changing the shape of the mannequin.
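The mannequin-and-wardrobe split above can be sketched in code. This is an illustrative toy, not the paper's actual data structure (the class name, fields, and values are assumptions): each "brick" keeps fixed geometry plus a dictionary of per-weather appearances, so swapping the weather never touches the shape.

```python
from dataclasses import dataclass, field

@dataclass
class WeatherGaussian:
    # Geometry: the "mannequin" -- identical across all weather conditions.
    position: tuple        # (x, y, z) center in world space
    scale: tuple           # per-axis extent of the brick
    rotation: tuple        # orientation as a quaternion (w, x, y, z)
    # Appearance: the "wardrobe" -- one look per weather condition.
    appearance: dict = field(default_factory=dict)

    def render_color(self, weather: str):
        # Changing the weather only changes this lookup, never the geometry.
        return self.appearance[weather]

# A brick from a red car, with two outfits in its wardrobe.
g = WeatherGaussian(
    position=(1.0, 0.0, 5.0),
    scale=(0.1, 0.1, 0.1),
    rotation=(1.0, 0.0, 0.0, 0.0),
    appearance={"sunny": (0.8, 0.2, 0.2), "snowy": (0.9, 0.85, 0.85)},
)
print(g.render_color("snowy"))  # (0.9, 0.85, 0.85): same car, paler snowy look
```

Because the geometry fields are never rewritten, the car cannot warp or drift between weather conditions; only its "outfit" changes.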

2. The "Smart Translator" (Text-Guided Editing)

You don't need to be a graphic designer. You just type a sentence, like "Make it rain heavily on this street."

  • The system uses a powerful AI (like a super-smart translator) to understand your words.
  • It looks at your original video and says, "Okay, I see the street. I will keep the buildings and cars exactly where they are, but I will change the lighting and atmosphere to match 'heavy rain'."
  • This ensures that if a car was red in the original video, it stays red in the rain. It doesn't accidentally turn the car blue or make it disappear.

3. The "Physics Engine" (Real Rain and Snow)

This is where WeatherCity gets really cool. Other tools might just paste a static image of rain over the video. But real rain moves!

  • The Particles: WeatherCity creates thousands of tiny, invisible "digital raindrops" and "snowflakes."
  • The Physics: It gives them rules. Rain falls straight down but gets pushed sideways by the wind. Snowflakes flutter and drift.
  • The Result: Because these particles are part of the 3D world, they interact correctly. If a raindrop falls behind a car, the car blocks it. If it falls in front, you see it clearly. It creates a realistic "depth" that flat editing can't do.
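The particle rules above can be sketched as a toy simulation. The constants and helper names here are assumptions for illustration, not the paper's implementation: gravity accelerates each drop downward, wind pushes it sideways, and a depth test decides whether the scene hides it.

```python
GRAVITY = 9.8      # m/s^2, pulls particles down
WIND = 2.0         # m/s, constant sideways push
DT = 1.0 / 30.0    # one video frame at 30 fps

def step(particle, dt=DT):
    """Advance one raindrop: gravity accelerates it, wind sets its drift."""
    x, y, z, vx, vy, vz = particle
    vy -= GRAVITY * dt          # falls a little faster every frame
    vx = WIND                   # wind fixes the horizontal drift
    return (x + vx * dt, y + vy * dt, z + vz * dt, vx, vy, vz)

def visible(particle_depth, scene_depth):
    """A drop is drawn only if nothing in the scene is closer to the camera."""
    return particle_depth < scene_depth

# One raindrop starting at rest: 10 m up, 4 m from the camera.
drop = (0.0, 10.0, 4.0, 0.0, 0.0, 0.0)
for _ in range(30):             # simulate one second of falling
    drop = step(drop)

car_depth = 3.0                 # a car 3 m from the camera
print(visible(drop[2], car_depth))  # False: the car blocks the drop behind it
```

Snowflakes would get different rules in the same loop (smaller terminal velocity, a fluttering sideways term), which is exactly why the two look different on screen.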

4. The "Foggy Window" (Fog Simulation)

Fog is tricky because it looks thicker the farther away you look: the more fog the light has to travel through, the more of it gets scattered away before reaching your eye.

  • WeatherCity uses a scientific rule (called the Beer-Lambert law) to simulate this.
  • It calculates how much light gets blocked as it travels through the "digital fog."
  • The result? Distant buildings fade away naturally, while the car right in front of the camera stays sharp, just like peering through a real foggy window.
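The Beer-Lambert rule is simple enough to show directly. In this toy sketch (the extinction coefficient and fog color are assumed values, not the paper's), the fraction of light that survives decays exponentially with distance, and each pixel blends toward the fog color by however much light was lost:

```python
import math

SIGMA = 0.02         # extinction coefficient: higher = thicker fog
FOG_COLOR = 0.75     # brightness of the fog itself (grayscale, 0..1)

def transmittance(depth):
    """Beer-Lambert law: fraction of light surviving a trip of `depth` meters."""
    return math.exp(-SIGMA * depth)

def foggy_pixel(scene_value, depth):
    """Blend scene and fog: the farther the pixel, the more fog wins."""
    t = transmittance(depth)
    return t * scene_value + (1.0 - t) * FOG_COLOR

near_car = foggy_pixel(0.2, depth=5.0)      # dark car close to the camera
far_tower = foggy_pixel(0.2, depth=200.0)   # equally dark tower, 200 m away

print(round(near_car, 3))   # stays near the original 0.2: barely fogged
print(round(far_tower, 3))  # close to 0.75: almost swallowed by the fog
```

The exponential is what makes the fade look natural: there is no hard "fog starts here" line, just a smooth loss of contrast with distance.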

Why Does This Matter?

Imagine you are training a self-driving car. You want to teach it how to drive in a blizzard, but you can't wait for a blizzard to happen in real life, and it's dangerous to test on real roads.

  • Before: You had to hope the weather would cooperate, or use fake, low-quality simulations that didn't look real.
  • With WeatherCity: You can take a video of a sunny day and instantly turn it into a dangerous blizzard or a heavy fog. You can test the car's AI in thousands of different weather scenarios without ever leaving the lab.

In a Nutshell

WeatherCity is like having a time-traveling weather machine for video. It takes a real city, builds a 3D model of it, and lets you dial up the rain, snow, or fog with a text message. It keeps the city looking real and consistent, while adding physics-based weather that moves and interacts with the world just like the real thing.
