Saturday, April 25, 2020

Procedural City: Room Decorations

It's time to add more decorations to the rooms in my procedural buildings. The walls and floors look too plain, so that's a good place to start. First up is office buildings. I added hardwood floor textures to houses, but office buildings are still using simple gray floors. I downloaded five office carpet textures and added them to the pool of randomly selected floor textures for non-house buildings. Here is an example.

One of the new office building carpet textures. This one even has a normal map!

Next up are houses. The hardwood floor textures look good, but I need to add some variety. How about fancy rugs covering some of the floors? I found five rug textures online, and added support for selecting random rugs and placing them in some rooms. Rooms that are otherwise empty (no table or chairs) have a higher chance of getting a rug so that they're not totally empty. These rugs are randomly sized and positioned randomly somewhere near the center of the room.

A house with rugs on the floors of some rooms.
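The rug placement described above can be sketched roughly like this. Note that `place_rug`, `rect`, and the size/jitter ranges are all my own illustrative choices, not 3DWorld's actual code: the rug gets a random size as a fraction of the room, and a center jittered around the room center so it stays near the middle without poking through a wall.

```cpp
#include <random>

struct rect { float x1, y1, x2, y2; };

// Sketch: pick a random rug size and a center near the room center.
// The chosen ranges guarantee the rug stays inside the room bounds.
rect place_rug(rect const &room, std::mt19937 &rng) {
    std::uniform_real_distribution<float> size_mult(0.4f, 0.8f); // fraction of room span
    std::uniform_real_distribution<float> jitter(-0.1f, 0.1f);   // center offset as fraction of span
    float const rw(room.x2 - room.x1), rh(room.y2 - room.y1);
    float const w(0.5f*size_mult(rng)*rw), h(0.5f*size_mult(rng)*rh); // rug half-widths
    float const cx(0.5f*(room.x1 + room.x2) + jitter(rng)*rw);
    float const cy(0.5f*(room.y1 + room.y2) + jitter(rng)*rh);
    return rect{cx - w, cy - h, cx + w, cy + h};
}
```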

That's good for floors, so it's time to work on the walls. The most obvious decorations to add to walls are framed pictures, and what better pictures to use than my library of 3DWorld screenshots? I added ten full resolution 1920x1024 screenshots of various scenes including nature, reflections, a procedural universe, and buildings. I even threw in one of my skybox images, which wasn't created by me. (Note that these images are large and haven't been added to the 3DWorld GitHub project.) Here is a room with a nebula picture on one wall and a reflective Sponza Atrium picture on the other wall.

A house with pictures on the walls and rugs on the floors.

But wait! Why limit pictures to existing images? Why not allow the player to create their own photos for hanging up on the walls? Sure, let's do that, just for fun. I added a hotkey to take a screenshot and add it to the building photo library, which replaces the initial screenshot images. Now the user can populate rooms with their own selected artwork.

A picture showing a screenshot of buildings taken a few seconds earlier, complete with "yellow status text of authenticity". Hey, dude, get out of the photo!

In fact, you can take screenshots of these pictures and create new pictures. The nesting of screenshots is unlimited.

A picture showing a screenshot of a picture of a screenshot. I had to shine the flashlight on it because it was somewhat dark due to the downward direction of the room light.

Here's another example.

A screenshot I took of the nearby city placed as a picture on the wall.

That's good for houses, but office buildings don't have many pictures on their interior walls. What they do have are offices, and many offices have whiteboards. I can conveniently reuse the picture system with a blank white texture and an added ledge to form a whiteboard.

An office with a whiteboard on the wall and a new carpet texture.

That's good enough for now. This was actually a pretty easy task, at least compared to some of the other work I've done on buildings such as navigation for people. At this point it's unclear what I should add next. I suppose more room furniture and smaller details are always useful.

Monday, April 6, 2020

Procedural City: AI Navigation for People in Building Interiors

I've continued to work on procedural city building interiors over the past few weeks. Let me start with something simple, then get into AI navigation later. I added more ceiling and floor textures, for example hardwood floors for houses. This improves the look of all my future indoor house screenshots. These changes also make it more pleasing to debug the wayward AI by watching people walk around for several minutes at a time. At some point I'll go back and add better textures to office buildings as well. Here are some examples:

House with normal mapped light hardwood floor, and ceiling tiles.

House with dark hardwood floor, and a plaster/stucco ceiling texture.

Next, I added animated people who use AI path finding to navigate between rooms of a building. As expected, this was quite time consuming and difficult to get right. It was more difficult than path finding for city pedestrians to avoid buildings and other objects, but it was probably easier than pedestrian street crossing and car avoidance logic. I haven't finished the logic for people to ascend/descend stairs, so for now they stay on a single floor of their building.

People don't exit buildings either. At least, they're not supposed to. When I first implemented moving people it didn't work as expected. (It rarely does.) What I had were people who walked straight out of their buildings, through the sky, to the center of the map. Like this:

Failed attempt at people path finding AI. They all walk out of the buildings and into the sky in the center of the map.

Yes, the 3D models still have their arms out. I haven't fixed that yet.

I eventually got the basic movement working after many hours of work. People slid across the floor and sometimes walked through walls, but at least they stayed in their buildings. That's progress, but the whole "sliding across the floor" situation could be improved. Yes, I have to make my 3D person model animations work in this context. Here I'm using a different shader that supports indoor room lighting, rather than the city shader I was using for pedestrians. The same shader is used for both building interiors and people so that I can draw both groups per-building without switching shaders. It already has animation support that I can enable, because it shares a base class with the city pedestrian shader. Surely it will work if I just enable animations, right? Let's see...

Accidentally animating buildings using the animation system meant for models of people.

Yeah, I was afraid of that. The building interiors are "animated" as well, meaning they rotate in some odd way around whatever random point of the building the animation system detects as the "legs". That screenshot doesn't really do it justice. In reality, the buildings are all spinning around madly so that you can't even tell what's going on. It's just a mess of flying polygons that only makes sense in a static screenshot. In fact, you can't even see the people through all of the building walls. Well, it makes for an interesting image anyway.

One hack that fixes it is to always reset the animation time to zero before drawing buildings. That works well enough for now. On to the problem of people walking through walls and furniture.

Wait, there's one more thing I need to go back and fix. I quickly realized that halfway open doors would be a problem. I never bothered to include them in collision detection with the player/user because they generally got in the way and prevented the player from walking around in some rooms. There were too many rooms where the combination of a table + chairs in the center plus doors on each side made the room impassible to the player. The AI collision detection ignored doors as well, for similar reasons.

How does this work in the real world? Most of the time, people will open doors as far as they can be opened to get them out of the way. That's either opened 90 degrees until they hit a perpendicular wall, or almost all the way to 180 degrees until they hit the wall containing the doorway. I can do the same here. The room generation and door placement algorithm can determine how far each door can be opened, and open it just short of that value. A margin of 15 degrees seemed to work well: the doors still open far enough to stay out of the way, while the gap keeps them (or rather their non-existent doorknobs) from intersecting the wall. Now doors are usually open like this:

A man walking into a room through the doorway. The door has room to fully open, so it stays out of the way. Compare this to the 90 degree open doors in the two screenshots at the top of this post.
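The door-opening rule can be sketched as a one-liner; `get_door_open_angle` is my illustrative name, and it assumes the placement pass has already computed the maximum angle the surrounding geometry allows.

```cpp
#include <algorithm>

// Sketch of the rule described above: open each door to the geometric
// maximum minus a 15 degree margin, so the door (and an imaginary
// doorknob) clears the wall it would otherwise rest against.
float get_door_open_angle(float max_geom_angle_deg) {
    float const margin(15.0f); // degrees short of the geometric limit
    return std::max(0.0f, max_geom_angle_deg - margin);
}
```

So a door that could swing 180 degrees opens to 165, and one blocked at 90 degrees opens to 75.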

That's much better. People hardly ever walk through doors now! But they walk through furniture and stairs all the time, and occasionally still walk through walls. And each other. I need to implement path finding through the room graph with collision avoidance for furniture. The first step is to construct a graph of the room connectivity through doorways and stairs. An added bonus is that I can now use this graph to perform connectivity analysis and find all of the buildings where the rooms aren't 100% connected to each other. I can print out their locations for debugging and go find them in the scene.

Okay, that was a lot of work. I thought the floor planning algorithm was pretty good, but apparently there were several bugs that were rare enough that I missed them in my manual building inspection. (I can't quite manually inspect 14K buildings.) Some house cubes/rooms were too small for doors. Some building parts came together at odd positions where walls blocked doorways. Some of them were missing the vertical connector stairs that join stacked floorplans. I was able to fix all but this last category. I have it down to 28 failed placements out of about 14,000 buildings. That's what, 0.2% failed floorplans? Good enough for now. I know where they are if I want to go back and fix them later. These people are asking me to make them stop running into things; I can't make them wait forever.

Right. Finally, after all these changes, I'm ready to start on AI navigation and path finding. I expected this to be interesting, and it was. I copied my A* (A-star) path finding algorithm from the game mode smiley AI waypoint navigation code and made it work with the building room graph. After many, many iterations I had a system where my 50,000 people could walk from their current position to the center of a different room in the same building through doorways without walking into walls. It's fast too, only around 3ms for 50K people! If you're interested, the code can be found here in my 3DWorld GitHub project.

Two people of different sizes walking through a room to an adjacent room.
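For anyone who hasn't seen A* on a graph before, here is a compact sketch in the spirit of what's described above. The real implementation is in the 3DWorld repo; these types and names are illustrative. Nodes are room centers connected through doorways, and straight-line distance to the goal serves as the admissible heuristic.

```cpp
#include <vector>
#include <queue>
#include <cmath>
#include <utility>
#include <functional>
#include <algorithm>

struct node_t {
    float x, y;            // room center, used for the distance heuristic
    std::vector<int> conn; // indices of rooms connected through doorways
};

float dist(node_t const &a, node_t const &b) {
    return std::sqrt((a.x - b.x)*(a.x - b.x) + (a.y - b.y)*(a.y - b.y));
}

// Returns the path as room indices from start to goal, or empty on failure.
std::vector<int> astar(std::vector<node_t> const &g, int start, int goal) {
    std::vector<float> gscore(g.size(), 1.0e30f);
    std::vector<int> came_from(g.size(), -1);
    using qe = std::pair<float, int>; // (f-score, node index)
    std::priority_queue<qe, std::vector<qe>, std::greater<qe>> open; // min-heap
    gscore[start] = 0.0f;
    open.emplace(dist(g[start], g[goal]), start);
    while (!open.empty()) {
        int const cur(open.top().second);
        open.pop();
        if (cur == goal) { // reconstruct the path by walking back to the start
            std::vector<int> path;
            for (int n = goal; n != -1; n = came_from[n]) path.push_back(n);
            std::reverse(path.begin(), path.end());
            return path;
        }
        for (int n : g[cur].conn) {
            float const ng(gscore[cur] + dist(g[cur], g[n]));
            if (ng < gscore[n]) { // found a shorter route to this room
                gscore[n] = ng;
                came_from[n] = cur;
                open.emplace(ng + dist(g[n], g[goal]), n);
            }
        }
    }
    return {}; // no path exists (disconnected rooms)
}
```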

The first part of the algorithm finds the high-level path from the starting room to the destination room by traversing the graph through rooms and doorways. A valid destination point is chosen randomly within the destination room so that it doesn't intersect any placed objects. Once the destination is reached, the AI waits there for a few seconds and then selects a new room to walk to. That part didn't take too long to get right, but it took a long time to smooth the path and push it away from walls and door frames so that people didn't partially intersect them. I had to use a collision radius for each person and shrink room boundaries by that value. This must be done for individuals because some of them have different sizes/radius values.
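The destination selection and radius shrinking might look something like the sketch below. This is my reconstruction with illustrative names; I'm also assuming rejection sampling for the "random point that doesn't intersect placed objects" part, since the post doesn't say exactly how that's done.

```cpp
#include <random>
#include <vector>

struct cube_t { float x1, y1, x2, y2; };

// Shrink the room by the person's collision radius, then rejection-sample
// a destination point that clears all placed objects (also expanded by
// the radius, so the person's body doesn't overlap furniture).
bool choose_dest(cube_t room, std::vector<cube_t> const &objs, float radius,
                 std::mt19937 &rng, float &dx, float &dy) {
    room.x1 += radius; room.y1 += radius; room.x2 -= radius; room.y2 -= radius;
    if (room.x1 >= room.x2 || room.y1 >= room.y2) return false; // room too small for this person
    std::uniform_real_distribution<float> rx(room.x1, room.x2), ry(room.y1, room.y2);
    for (unsigned n = 0; n < 100; ++n) { // give up after 100 tries
        float const x(rx(rng)), y(ry(rng));
        bool valid(true);
        for (cube_t const &o : objs) {
            if (x > o.x1 - radius && x < o.x2 + radius && y > o.y1 - radius && y < o.y2 + radius) {
                valid = false;
                break;
            }
        }
        if (valid) { dx = x; dy = y; return true; }
    }
    return false;
}
```

Passing the radius in per call is what lets this work for people of different sizes.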

The next step is planning a valid path from one doorway to another across a room. I started trying to write an A* algorithm for this, but eventually I gave up on that approach. It didn't work and was too difficult to debug. The solution I went with involved picking one or two random points within the room that didn't intersect any geometry. Then the path finder tried to connect the two doorway endpoints with one or both of these points to form a valid path that didn't intersect any of the objects. The shortest valid path was used as the final solution. If there was no path, the path finding failed and the person stood there for one second. After that time was up, the path finding algorithm started again with a new random destination and path.
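The random-intermediate-point approach can be sketched as below. Everything here is illustrative: `seg_clear` is a crude stand-in (it just samples points along the segment) for whatever segment-vs-furniture intersection test the real code uses, and the shortest-valid-path selection follows the description above.

```cpp
#include <random>
#include <vector>
#include <cmath>
#include <utility>

struct pt { float x, y; };
struct box { float x1, y1, x2, y2; };

// Crude segment-vs-obstacle test by point sampling; a sketch, not robust.
bool seg_clear(pt a, pt b, std::vector<box> const &objs) {
    for (unsigned i = 0; i <= 20; ++i) {
        float const t(i/20.0f), x(a.x + t*(b.x - a.x)), y(a.y + t*(b.y - a.y));
        for (box const &o : objs) {
            if (x > o.x1 && x < o.x2 && y > o.y1 && y < o.y2) return false;
        }
    }
    return true;
}
float len(pt a, pt b) { return std::sqrt((a.x-b.x)*(a.x-b.x) + (a.y-b.y)*(a.y-b.y)); }

// Try the direct segment, then detours through one or two random interior
// points; keep the shortest valid path, or return empty on failure.
std::vector<pt> cross_room(pt enter, pt exit, box room, std::vector<box> const &objs, std::mt19937 &rng) {
    if (seg_clear(enter, exit, objs)) return {enter, exit};
    std::uniform_real_distribution<float> rx(room.x1, room.x2), ry(room.y1, room.y2);
    pt p1{rx(rng), ry(rng)}, p2{rx(rng), ry(rng)};
    std::vector<pt> best;
    float best_len(1.0e30f);
    for (pt p : {p1, p2}) { // one intermediate point
        if (seg_clear(enter, p, objs) && seg_clear(p, exit, objs)) {
            float const l(len(enter, p) + len(p, exit));
            if (l < best_len) { best_len = l; best = {enter, p, exit}; }
        }
    }
    for (auto pr : {std::make_pair(p1, p2), std::make_pair(p2, p1)}) { // two points, either order
        if (seg_clear(enter, pr.first, objs) && seg_clear(pr.first, pr.second, objs) &&
            seg_clear(pr.second, exit, objs)) {
            float const l(len(enter, pr.first) + len(pr.first, pr.second) + len(pr.second, exit));
            if (l < best_len) { best_len = l; best = {enter, pr.first, pr.second, exit}; }
        }
    }
    return best; // empty means path finding failed; caller retries after a delay
}
```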

Since failures seemed to be rare, this kept everyone moving. I couldn't find anyone who seemed like they had somewhere they could go but weren't moving. The only people who couldn't move were those placed in single room houses or in a corner of a room blocked by a table with chairs. My fixes for these cases were to not place people in garages and sheds, and to allow them to walk through geometry on the first segment of their first path. In theory, it shouldn't be possible for them to get stuck again. If they got into that spot, they should be able to get back out (unless pushed by another person).

That handles collisions with static objects. The next task was preventing people from walking through each other like they're ghosts. This was a bit more difficult. People are dynamic, so they can't be included in the path planning system, which determines the entire path ahead of time for each new destination. What I did here was to check for collisions with every person in the same building with a higher ID than the current person. These other people have "priority." A person's new position is computed by moving them along their current direction vector, a smoothed version of the direction to the next path point, by their velocity times the elapsed time. If the new position collides with another person, then no movement is made. If the existing position also collides, then the target person is moved along the collision vector, away from the collider. This applies to people who are standing still waiting to choose a new destination as well.

That somewhat works, but it introduces a new problem. People can now push each other into furniture, through walls, and even outside of buildings. I fixed this by clamping their locations to the valid walkable areas of each room. This results in some model intersections, but it's better than intersections with room and wall geometry. I had the room path finding algorithm ignore a person's starting position if it was inside room geometry to allow them to escape once pushed inside tables and chairs. This situation is relatively rare, so it's unclear if my solution always works.
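The clamping fix is simple enough to sketch in a few lines (illustrative names again): after any push, the position is clamped to the room interior shrunk by the person's collision radius, so no push can shove them through a wall or outside the building.

```cpp
#include <algorithm>

struct bounds_t { float x1, y1, x2, y2; };

// Clamp a pushed person back into the walkable area of their room,
// shrunk by their collision radius so their body clears the walls.
void clamp_to_room(float &x, float &y, bounds_t const &room, float radius) {
    x = std::min(std::max(x, room.x1 + radius), room.x2 - radius);
    y = std::min(std::max(y, room.y1 + radius), room.y2 - radius);
}
```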

I could go on and on about this system, but I think that's enough details for now. I'll try to add some videos later. There are still some finishing touches I need to add first.

Here is one final image showing the wireframe of a building with people walking around inside of it.

Wireframe view of buildings with people walking inside them.

There is a lot more work to do. I would like to have people ascend and descend stairs to reach different floors of a building. I did some experiments with stairs where people are teleported up or down one level. It works, and the path finding can handle it, but it doesn't look good. I need to figure out how to make them walk up and down the stairs. The hard part here is that they need to start at different ends of the stairs depending on whether they're going up or down, and that's not something I can easily add to the room graph. Maybe I need two different graphs, or two sets of edges corresponding to "up" and "down"? I'll continue to improve it later.

There's one more fun thing I should mention. At one point I occasionally saw people who stood there spinning in circles forever. It took me a long time to figure out why they had this odd behavior. After adding lots of debug code and printouts, I realized they were trying to stand on a very exact spot on the ground. Each time they took a step they overshot their target, then turned and tried to step in a different direction. Their minimum turning radius made sure they could never quite get to the correct spot. If they ended up a very specific distance from the target it caused them to spin in circles. The fix for this was to add a term that decreased their step length when their target position wasn't directly in front of them. This reduced their speed while turning and made their steering more accurate at the ends of their paths.