I would like it to be possible to give the model some hints by providing maps showing where roads, buildings, and vegetation should be, and let it work its magic. Maybe take it further by procedurally generating buildings based on their footprints and building type (hospitals, schools, etc.) as identified in the maps. That way we could have racing sims on roads we are familiar with. And much more…
There exist maps derived from satellite images, called Land Use Land Cover (LULC) maps, that categorise each pixel into predefined classes like built-up, forest, water, roads, etc. Granted, these days semantic segmentation is used to generate such maps, but traditionally they were produced with classical image processing and manual digitization. There is no AI involved in using them for the purpose I mentioned, though. Game engines like Unity have built-in tools, as well as add-ons like Gaia, with cool procedural generation features. Using such maps in conjunction with those could help in creating realistic and familiar worlds, and this Blender tool gives me much hope.
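To make the idea concrete, here is a minimal sketch of how a LULC grid could feed a procedural generator: each pixel's class id gets translated into per-asset density hints (buildings, roads, trees) that a scatter system could then consume. The class codes, density values, and the `density_maps` helper are all hypothetical illustrations, not any particular LULC product's legend or engine API.

```python
# Minimal sketch: turn a LULC class grid into per-cell placement hints
# for a procedural generator. Class codes and density values here are
# hypothetical; real LULC products define their own legends.
import numpy as np

# Hypothetical legend: class id -> (label, what a generator might scatter there)
LEGEND = {
    1: ("built_up",  {"buildings": 0.8, "roads": 0.6, "trees": 0.05}),
    2: ("forest",    {"buildings": 0.0, "roads": 0.0, "trees": 0.9}),
    3: ("water",     {"buildings": 0.0, "roads": 0.0, "trees": 0.0}),
    4: ("road",      {"buildings": 0.0, "roads": 1.0, "trees": 0.0}),
    5: ("grassland", {"buildings": 0.05, "roads": 0.0, "trees": 0.2}),
}

def density_maps(lulc: np.ndarray) -> dict:
    """Convert a 2D array of class ids into per-asset density maps in [0, 1]."""
    assets = ("buildings", "roads", "trees")
    out = {a: np.zeros(lulc.shape, dtype=np.float32) for a in assets}
    for class_id, (_, densities) in LEGEND.items():
        mask = lulc == class_id
        for asset, d in densities.items():
            out[asset][mask] = d
    return out

if __name__ == "__main__":
    # Toy 4x4 LULC tile; in practice this would come from a real raster,
    # e.g. read with rasterio from a GeoTIFF.
    tile = np.array([
        [2, 2, 5, 1],
        [2, 5, 4, 1],
        [5, 4, 4, 1],
        [3, 3, 5, 5],
    ])
    hints = density_maps(tile)
    print(hints["trees"])  # high where forest, zero on water and roads
```

A generator (in Unity, Blender, or elsewhere) could sample these density maps to decide where to instance buildings, lay roads, or scatter vegetation, which is roughly the "hint map" workflow described above.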