Dungeon Restocker was a prototype created for the first phase of our Champlain Senior Capstone. It was built in Unreal Engine 4 and was unfortunately never released. The premise was that you played a middle manager of a heroic dungeon, placing minions, resetting traps, and filling chests in order to please the Dungeon Master, your omnipotent boss. An AI Hero would then travel through your dungeon, and the Dungeon Master would rate your performance based on how challenging your dungeon was.
Because we were a small team, most of my work was generalist, switching around wherever I was needed; still, the majority of my time went to the AI for the Hero that ran the player's dungeon. I also built some of the smaller UI elements with Unreal's UMG and Widgets.
The Hero's AI was built iteratively and cautiously, since we knew the risk of building an AI that was more than a basic seeker on a short-term project. The first task was making the Hero navigate the dungeon on his own. Initially, we wanted to use Unreal's Environment Query System (EQS) to give him realistic sight to find his way through, but we quickly dismissed that as far too complex. Instead, we created nodes that the designers would place around the map to guide the Hero. He would check for the nearest points of interest and discard any he couldn't see, so he didn't pick up on chests through walls and the like.
This allowed the AI to stumble his way through the map, eventually. Later on, we had him consider all minions and chests as well, a group we called Dungeon Restocker relevant items. The Hero prioritized nodes by how important they were to him (a designer-set importance float) and by other situational factors; for example, at low HP he would try to avoid minions rather than fight them.
Once the Hero picked a target, his Behavior Tree chose a response appropriate to the type of object stored as his target in his blackboard. This could range from engaging in combat to simply walking to an invisible node.
The AI was mostly functional, if not incredibly complex, and I was proud of it for something hastily put together for a prototype. I liked most of what was happening; what I didn't like was mostly what wasn't happening. I would have liked to see a more complex response from the AI in combat. Its combat routine was essentially: while in combat, seek new combat targets first before checking everything else. This meant that in the case of the last target, it would check targets twice, assembling the list of potentials twice. There was certainly a solution that handled that edge case better. As a whole, my complaints are mostly nit-picks like that.