Well, I'm pretty good at cogging, and I was thinking lately, while playing Q3A and UT: hmm, how do the bots react in such a human-like way to real game situations? Is this possible in JK? Can we make the bots, in essence, 'think'? This got my juices churning, and I was wondering if anyone has any ideas. I was thinking of a custom movement routine that can be dynamically modified by a control cog (rough sketch below), but I'm not too sure. Also, if anyone has any technical information on how the bots from Q3A and UT work, I'd be more than happy to receive it.
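To give a better idea of what I mean by a control cog, here's a very rough sketch: a pulse-driven 'brain' cog that re-evaluates the situation every half second and hands movement goals off to the engine AI. I'm writing the AI verbs (AISetMoveThing, AISetLookPos) from memory, so the names and parameters may be off; treat it as an illustration of the structure, not working code.

# bot_brain.cog -- very rough sketch, not tested
# AI verb names below are from memory; check them against a COG reference.

symbols

thing       bot                          # assigned when the cog is placed (like Nightmare's AI cog)
thing       target             local
flex        dist               local

message     startup
message     pulse

end

# ........................................................................

code

startup:
    SetPulse(0.5);             # re-think twice per second
    return;

pulse:
    target = GetLocalPlayerThing();                           # simplest case: go after the player
    dist = VectorDist(GetThingPos(bot), GetThingPos(target));

    if(dist > 2)
    {
        AISetMoveThing(bot, target);                          # too far: close the distance
    }
    else
    {
        AISetLookPos(bot, GetThingPos(target));               # close enough: face the target
    }

    return;

end

The idea would be that you could swap the body of the pulse handler out for smarter decision logic (dodging, retreating, grabbing items) without ever touching the movement layer underneath.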
(Oh, BTW, if anyone's interested in why I'm asking: I've modified Nightmare's AI cog (over the course of the last year and a half) to support low-lag adventure-style play, and I wanted to put some 'oomph' into the bots.)