Friday, October 21, 2011

Software Engineering So Far: A Midterm study guide

Halfway into the semester, we've covered a sizable amount of content with respect to Software Engineering. It's interesting to see how early on we learned about coding standards, and how that has come full circle now that we're dealing with build systems and configuration management. Without some kind of standard in place, all there will ever be is confusion and frustration. Below are five sample midterm questions that I believe are important to what we've learned so far this semester. None of the questions are cherry-picked, as I believe all the material is important in one way or another.

The following are five questions and answers about the topics I've learned up until this point.

1) What is Ant?

Ant is a Java-based build tool that lets you construct build scripts in much the same way that the "make" tool does for C or C++ projects.
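As a minimal sketch of what an Ant build script looks like (the project name and directory layout here are made up for illustration, not taken from a class project):

```xml
<!-- build.xml: a minimal illustrative Ant build file. -->
<project name="example" default="compile" basedir=".">
  <!-- Properties play the role of make variables. -->
  <property name="src.dir" value="src"/>
  <property name="build.dir" value="build"/>

  <!-- A target is roughly equivalent to a make rule. -->
  <target name="compile">
    <mkdir dir="${build.dir}"/>
    <javac srcdir="${src.dir}" destdir="${build.dir}"/>
  </target>
</project>
```

Running `ant compile` then invokes the `compile` target, much like `make compile` would invoke a makefile rule.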


2) Ant properties are immutable. Explain what this means and give an example.

Once an Ant property is set, it cannot be changed for the rest of the build. For example,

<property name="MyProperty" value="One"/>
<echo message="MyProperty = ${MyProperty}"/>
<property name="MyProperty" value="Two"/>
<echo message="MyProperty = ${MyProperty}"/>

will print the following, because the second property assignment is silently ignored:

[echo] MyProperty = One
[echo] MyProperty = One


3) When deploying new software configuration management tools it's always good practice to use common build tools. Why?

Much time is wasted when a developer cannot reproduce a problem found in testing, or when the released product differs from what was tested. Ensuring that everyone working on a project uses the same tools makes tracking down a problem much easier.


4) List and explain what a software review exposes that testing does not.

Various answers include:

Reviews are proactive. They find errors that are not possible to catch through testing, such as an unimplemented requirement.

5) ICS-SE-Java-2 says not to use the wildcard "*" in import statements. Why?

You must explicitly import each class that you use from other packages. This is an important form of documentation to readers of your class. It doesn't really make sense for a program to throw in the kitchen sink when solving a problem.
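A small example of the idea (the class and its contents are hypothetical, made up just to illustrate the rule):

```java
// Explicit imports tell the reader exactly which classes this file
// depends on, and which package each one comes from.
import java.util.ArrayList;
import java.util.List;

public class ImportDemo {
    public static void main(String[] args) {
        // With "import java.util.*;" a reader could not tell at a glance
        // whether List here is java.util.List or, say, java.awt.List.
        List<String> robots = new ArrayList<>();
        robots.add("ShootNScoot");
        System.out.println(robots.size());
    }
}
```

The wildcard form would compile just the same, but it hides the dependency list and can even cause name clashes when two imported packages define classes with the same name.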

Thursday, October 20, 2011

Configuration Management and Google Project Hosting

For a couple of months now, I've been using my Dropbox account to keep track of my work. When I'm at home I use my Windows machine, since I prefer to program on a 24" screen, and when I'm away I have to use my 13" MacBook. I simply do not like emailing myself source code because of the potential overhead it creates on my email account. Early on, I also found that it wasn't a very good idea to set my Eclipse IDE workspace as a folder that syncs with Dropbox, since it would have been a hassle if I really screwed something up. So this whole idea of Configuration Management coupled with Google Project Hosting is an alternative to my current system of simply copying and pasting code as I transfer work from one machine to another.

As a task, I've set up my robocode-gja-shootnscoot system at Google Project Hosting. Although it took quite a bit of time to accomplish all the tasks, I feel it was a good learning experience. In terms of difficulty, I found that writing the wiki pages took the most time and effort, since I can definitely see people getting confused about how to run an unfamiliar system when all they know is how to click the "Run" button in their favorite IDE. One thing in particular I've noticed when working with my peers on concurrent revision commits is the potential for a bottleneck when too many people try to commit their changes to the system at once. But the chance of that happening constantly seems minimal when the number of people working on a project is small.

Overall, I feel that configuration management is a powerful tool that allows people to collaborate more effectively as opposed to just relying on some kind of mailing list. It allows the group to keep track of progress on a given project and makes available the system for everyone to improve.

The ShootNScoot system hosted on Google Project Hosting: https://code.google.com/p/robocode-gja-shootnscoot/

Tuesday, October 11, 2011

ShootNScoot: A Competitive Robot

As the name implies, the ShootNScoot robot is roughly modeled on the standard military tactic of Shoot-and-scoot. Although not as elegant as the actual tactic described in the link, ShootNScoot displays the basic strategy of firing at a target and then immediately moving away from the location where the shots were fired.

The basic design of the robot is as follows:

Movement: Using a random number generator, a firing position is generated in the form X, Y where the robot will move to at the beginning of each turn. This deviates a bit from the Shoot-and-scoot tactic by initially moving then firing, however, by the start of the second turn the robot should exhibit normal Shoot-and-scoot behavior by 1) Firing at an enemy and 2) Moving from the previous firing position to a new firing position in an attempt to evade enemy fire.
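A minimal sketch of this kind of position generation (the battlefield dimensions and wall margin below are illustrative assumptions, not ShootNScoot's actual values):

```java
import java.util.Random;

// Sketch: pick a random firing position away from the walls,
// in the spirit of the movement strategy described above.
public class FiringPosition {
    static final double WIDTH = 800, HEIGHT = 600, MARGIN = 50;
    static final Random RNG = new Random();

    // Returns a new {x, y} firing position, keeping MARGIN away
    // from every wall so the robot has room to maneuver.
    static double[] nextPosition() {
        double x = MARGIN + RNG.nextDouble() * (WIDTH - 2 * MARGIN);
        double y = MARGIN + RNG.nextDouble() * (HEIGHT - 2 * MARGIN);
        return new double[] { x, y };
    }

    public static void main(String[] args) {
        double[] p = nextPosition();
        // Verify the generated point stays inside the margins.
        System.out.println(p[0] >= MARGIN && p[0] <= WIDTH - MARGIN
            && p[1] >= MARGIN && p[1] <= HEIGHT - MARGIN);
    }
}
```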

Targeting: This robot employs a sort of hit-and-run tactic by attacking various targets instead of focusing on just one robot. This is in the spirit of Guerrilla warfare which goes along with the Shoot-and-scoot strategy.

Firing: Sticking with the Shoot-and-scoot philosophy, this robot will not shoot unless it is fully stopped. Also, depending on distance to target, bullet power is varied to increase the chance of a successful hit. Ideally, the robot should be able to fire while moving given an enemy with low energy.
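The distance-based power selection could look something like this sketch (the distance thresholds are assumptions for illustration; only the fact that Robocode bullet power ranges from 0.1 to 3.0 comes from Robocode itself):

```java
// Sketch of varying bullet power by distance to target: close targets
// are easy to hit, so spend maximum power; distant targets get a weaker,
// faster bullet to improve the chance of connecting.
public class BulletPower {
    static double powerFor(double distance) {
        if (distance < 200) return 3.0;  // close range: max damage
        if (distance < 500) return 2.0;  // medium range
        return 1.0;                      // long range: faster bullet
    }

    public static void main(String[] args) {
        System.out.println(powerFor(100));
        System.out.println(powerFor(600));
    }
}
```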

Pitting ShootNScoot against the sample robots showed weaknesses in my Shoot-and-scoot implementation. The following are one-vs-one battles over five rounds, with the score percentage in parentheses.

ShootNScoot(10%) vs. Walls(90%)
ShootNScoot(42%) vs. RamFire(58%)
ShootNScoot(26%) vs. SpinBot(72%)
ShootNScoot(81%) vs. Crazy(19%)
ShootNScoot(78%) vs. Fire(22%)
ShootNScoot(46%) vs. Corners(54%)
ShootNScoot(31%) vs. Tracker(69%)

There were a couple of close ones, but Walls will by far always dominate ShootNScoot. Given that Walls is always on the move, the only reliable way to win against it would be to fire ahead of the robot to score a successful hit. Another robot with a high win percentage is SpinBot, which employs a tactic similar to Walls by constantly moving in circles while firing at the same time. This allows SpinBot not only to evade random fire from ShootNScoot but to successfully return fire. In most situations, the design of ShootNScoot didn't work because too much time is spent moving around and not shooting back. Although the idea of moving to a randomly generated position seems sound, getting hit while moving to that position doesn't help the robot at all. To improve the design of my robot, I would improve its firing system by allowing ShootNScoot to fire while moving to a new position. I would also try to predict enemy tactics by storing enemy information and adjusting my robot's tactics on the fly.
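The "fire ahead of the robot" idea can be sketched with simple linear targeting, a standard Robocode technique that is not part of my current implementation (the only Robocode fact used here is that bullet speed is 20 - 3 × power):

```java
// Sketch of linear targeting: compute the angle offset to add to the
// direct bearing so a bullet meets a target moving at constant velocity.
public class LinearLead {
    static double leadOffset(double enemyVelocity, double enemyHeading,
                             double absoluteBearing, double bulletPower) {
        double bulletSpeed = 20 - 3 * bulletPower;  // Robocode bullet speed
        // Project the enemy's lateral velocity onto the line of fire and
        // solve for the lead angle.
        return Math.asin(enemyVelocity
            * Math.sin(enemyHeading - absoluteBearing) / bulletSpeed);
    }

    public static void main(String[] args) {
        // A stationary target needs no lead at all.
        System.out.println(leadOffset(0, 0, 0, 2.0));
    }
}
```

Against Walls, which moves at a steady speed along straight walls, this kind of prediction should land far more shots than firing at the target's current position.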

In terms of testing, two acceptance tests were created that verified that ShootNScoot can reliably beat the SittingDuck and Fire robots. The choice of SittingDuck was a no-brainer since this served as a basis for additional testing and to simply show that ShootNScoot does not spontaneously combust upon the start of battle. Out of all the other sample robots, ShootNScoot showed it was able to win against Fire going 10+ rounds. Finally, various behavioral tests were created to verify that the following components of ShootNScoot are working as intended:

1) Generation of random position.
2) Movement to randomly generated coordinate.
3) Variable firing system.
4) Target acquisition.

With regard to software engineering, this project has taken the Three Prime Directives head on and exposed me to what it takes to ensure that these directives are met. I feel that if I had written the test cases before the actual robot implementation, ShootNScoot would have come out a little better. The reality is that the test cases were developed in response to the robot implementation, and my testing reflects that. Automated quality assurance tools like PMD and FindBugs picked up little things in my source code that I would otherwise have missed. Next time, I will definitely focus more on testing before implementing the robot. That way, testing would expose poor design choices against other robots earlier and allow me to adjust my design specifications to create a more powerful robot.