Wednesday, December 14, 2011

And Now for Something Completely Different: Hale Aloha CLI, Version 2

Arriving at the point in software engineering where one has to work off a different code base is daunting from the get-go. So far in this course, my group, consisting of Branden Ogata and Jason Yeo, was tasked with creating a simple command-line interface application that interacts with the WattDepot server monitoring energy usage in the Hale Aloha residence halls. Next, we performed a technical review of another group's system that had similar behavior to our own. Now, we continue our work in issue-driven project management by putting the third prime directive of software engineering to the test:

3. An external developer can successfully understand and enhance the system.

Although we made every effort to meet in person, our busy schedules prevented us from meeting on a regular basis. What worked for us was to collaborate through both Google Docs and a single email thread where we kept track of issues. For this particular assignment, the SmartSVN client that maintains our source code caused some problems for me when other group members committed changes to a file I was currently working on. For example, updating my local copy of a file would merge my teammates' changes into my working copy, and I would have to go in and clean it up. Maybe I haven't fully figured out how to use this application, but my only workaround was to start fresh by checking out the newest revision and re-inserting my changes manually. Although time-consuming, this worked, and we would often email each other to give notice of an upcoming commit to the repository.

For this final assignment, our group added the following commands to hale-aloha-cli-cycuc:

(1) set-baseline [tower | lounge] [date]

This command defines [date] as the "baseline" day for [tower | lounge]. [date] is an optional argument in YYYY-MM-DD format and defaults to yesterday. When this command is executed, the system should obtain and save the amount of energy used during each of the 24 hours of that day for the given tower or lounge. These 24 values define the baseline power for that tower or lounge for that one hour time interval. For example, if lounge Ilima-A used 100 kWh of energy during the hour 6am-7am, then the baseline power during the interval 6am - 7am for Ilima-A is 100 kW.
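The bookkeeping behind set-baseline can be sketched in a few lines of Java. The class and method names below are illustrative, not the actual hale-aloha-cli-cycuc implementation; the key point is that energy used over one hour in kWh is numerically equal to the average power in kW for that hour.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of set-baseline bookkeeping; names are illustrative.
public class Baseline {
  // Source name -> 24 hourly baseline power values in kW.
  private final Map<String, double[]> baselines = new HashMap<>();

  /**
   * Records the baseline for a source. Energy used over one hour in kWh
   * equals the average power in kW for that hour.
   */
  public void setBaseline(String source, double[] hourlyEnergyKwh) {
    if (hourlyEnergyKwh.length != 24) {
      throw new IllegalArgumentException("Expected 24 hourly values");
    }
    this.baselines.put(source, hourlyEnergyKwh.clone());
  }

  /** Returns the baseline power in kW for the given source and hour (0-23). */
  public double getBaselinePower(String source, int hour) {
    double[] hours = this.baselines.get(source);
    if (hours == null) {
      throw new IllegalStateException("No baseline set for " + source);
    }
    return hours[hour];
  }
}
```

With this structure, the Ilima-A example above falls out directly: store 100.0 at index 6, and the baseline power for the 6am-7am interval reads back as 100 kW.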

(2) monitor-power [tower | lounge] [interval]

This command prints out a timestamp and the current power for [tower | lounge] every [interval] seconds. [interval] is an optional integer greater than 0 and defaults to 10 seconds. Entering any character (such as a carriage return) stops this monitoring process and returns the user to the command loop.
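A minimal sketch of how such a polling loop might work, assuming hypothetical class and method names (this is not the actual implementation): it prints a timestamped reading each interval and stops as soon as any input is pending.

```java
import java.io.IOException;
import java.io.InputStream;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.function.DoubleSupplier;

// Hypothetical sketch of the monitor-power loop; names are illustrative.
public class PowerMonitor {
  /**
   * Prints a timestamped power reading every intervalSeconds until any
   * input (such as a carriage return) arrives on in.
   */
  public static void monitor(DoubleSupplier currentPowerKw, int intervalSeconds,
      InputStream in) {
    if (intervalSeconds <= 0) {
      throw new IllegalArgumentException("interval must be greater than 0");
    }
    SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
    try {
      // Any pending input ends the monitoring and returns to the command loop.
      while (in.available() == 0) {
        System.out.format("%s %.1f kW%n", format.format(new Date()),
            currentPowerKw.getAsDouble());
        Thread.sleep(intervalSeconds * 1000L);
      }
    }
    catch (IOException | InterruptedException e) {
      throw new RuntimeException(e);
    }
  }
}
```

In the real application the DoubleSupplier would wrap a WattDepot client call and in would be System.in.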

(3) monitor-goal [tower | lounge] goal interval

This command prints out a timestamp, the current power being consumed by the [tower | lounge], and whether or not the lounge is meeting its power conservation goal. [goal] is an integer between 1 and 99. It defines the percentage reduction from the baseline for this [tower | lounge] at this point in time. [interval] is an integer greater than 0.

For example, assume the user has previously defined the baseline power for Ilima-A as 100 kW for the time interval between 6am and 7am, and the current time is 6:30am. If the goal is set as 5, then Ilima-A's current power must be 5% less than its baseline in order to make the goal. At the current time, that means that Ilima-A should be using less than 95 kW of power in order to make its goal.

It is an error if the monitor-goal command is invoked without a prior set-baseline command for that [tower | lounge].

Entering any character (such as a carriage return) stops this monitoring process and returns the user to the command loop.
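The goal arithmetic from the Ilima-A example is a single percentage calculation; a small sketch (the class name is mine, not from the actual code):

```java
// Illustrative sketch of the monitor-goal arithmetic.
public class GoalCheck {
  /**
   * Returns the power threshold in kW that a source must stay under to meet
   * a percentage-reduction goal against its baseline.
   */
  public static double threshold(double baselineKw, int goalPercent) {
    if (goalPercent < 1 || goalPercent > 99) {
      throw new IllegalArgumentException("goal must be between 1 and 99");
    }
    return baselineKw * (100 - goalPercent) / 100.0;
  }

  /** True if the current power is below the goal threshold. */
  public static boolean meetsGoal(double currentKw, double baselineKw, int goalPercent) {
    return currentKw < threshold(baselineKw, goalPercent);
  }
}
```

For a 100 kW baseline and a goal of 5, the threshold is 95 kW, matching the example above.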

Our group was able to implement all three commands to specification. As of this writing, I think the overall quality of the system can still be improved. Although we were not responsible for the operation of the four commands the previous team had implemented, everyone on our team contributed to improving the original system as time permitted. For example, a team member reported that "output for the original commands are incorrect and at times not present". To be consistent with the three new commands in terms of look and feel, one team member modified the four original commands so that everything fit together. Finally, we were also able to go back and fix some of the issues our group raised in our technical review of cycuc's system, detailed in my last blog entry.

Overall, there were some difficulties in extending hale-aloha-cli-cycuc with three new commands. I think most people would have problems adding features to someone else's code base and would resort to rewriting everything from the ground up.
However, I feel that as a group working together, it is doable. Of course, it helps if you know the original developers and can ask for their input on how they would do certain things. Doing this project proved invaluable in allowing our group to experience firsthand what it takes to become a software engineer.

Friday, December 2, 2011

Hale Aloha CLI Technical Review

Looking back to my first blog post about Open Source Software and the Three Prime Directives, the goal of any open source software should be to fulfill these directives to ensure maximum usefulness to both end-users and external developers. To do that assignment properly, I had to find open source software that fulfilled the three prime directives which proved difficult. In my last blog post, I talked about my experiences when tasked with developing a command-line interface using the WattDepot client. This blog post will explore the process of performing a software technical review on a similar command-line interface developed by another team.

The following is the technical review my team put together while evaluating hale-aloha-cli-cycuc with respect to the three prime directives.


Review question 1: Does the system accomplish a useful task?

To answer this question, you must download, install, and run the system under review. You should exercise the system with a variety of values for each of its possible commands. You want to discover if there is any functionality that should be present that is not present. You must document what functionality is present in the system and then come to a conclusion regarding the usefulness of the system.



When we initially ran Team cycuc's .jar file, there was a problem: it would not run at all. Eventually one of cycuc's team members updated the system so that it runs successfully in the console. For the most part, this system provides the functionality described in the assignment specifications; however, the formatting of both the date and the power/energy values does not match the sample output given.


Below is a sample run of team cycuc’s system:






> current-power Ilima-A
Ilima-A's power as of 2011-11-07 13:48:56 was 2.3 kW.
> daily-energy Mokihana 2011-11-05
Mokihana's energy consumption for 2011-11-05 was: 89 kWh.
> energy-since Lehua-E 2011-11-01
Total energy consumption by Lehua-E from 2011-11-01 00:00:00 to 2011-11-09 12:34:45 is: 345.2 kWh
> rank-towers 2011-11-01 2011-11-09
For the interval 2011-11-01 to 2011-11-09, energy consumption by tower was:
Mokihana 345 kWh
Ilima 389 kWh
Lehua 401 kWh
Lokelani 423 kWh


This may be an issue for some people depending on how they plan to process the data given. Reporting the data in consistent units such as kilowatts may be more desirable. In the case of the rank-towers command, no units appear next to the output, which may confuse those not familiar with the system. Also, the daily-energy and energy-since commands appear to report the wrong units for the data given. From personal experience with the getData() method, the data returned must be converted correctly to MWh; in this case, 549 kWh should be 0.549 MWh.
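The conversion in question is a single division; a sketch (the class and method names are mine, not from cycuc's code):

```java
// Illustrative unit conversions for energy values.
public class EnergyUnits {
  /** Converts kilowatt-hours to megawatt-hours. */
  public static double kwhToMwh(double kwh) {
    return kwh / 1000.0;
  }

  /** Converts watt-hours (the raw scale WattDepot-style data may use) to kilowatt-hours. */
  public static double whToKwh(double wh) {
    return wh / 1000.0;
  }
}
```

Applying it to the figure above, 549 kWh becomes 0.549 MWh.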

Some of the commands do not successfully return data at all. Essentially, the system attempts to implement the four commands listed in its help menu. The exact usefulness of this system is debatable; we deem this version of cycuc's system not ready for distribution.



Review question 2: Can an external user successfully install and use the system?

To answer this question, you should begin by carefully reviewing the project site associated with the system. Does the home page provide a clear understanding of what the system is supposed to accomplish, perhaps accompanied by sample input and output? Is there a User Guide wiki page that provides details on how to download, install, and execute the system as a user? Is there a downloadable distribution that provides an executable jar file, so that users do not have to compile and build the system in order to use it? Does the distribution contain a version number so that users can easily keep track of what system they are using as it evolves over time?


In addition to containing the files for hale-aloha-cli-cycuc, the project site provides a very general idea of what the project is and does. The home page has a brief description of the system and a picture that presumably explains the group name. This gives viewers an idea of what the system does, but not a very clear one. There is no User Guide page; instead, there is a page titled “PageName” that contains most of the information a User Guide should. The exception is how to execute the system, which is not covered. The distribution file in the Downloads section does include a working version of the system along with an executable .jar file. The version number is included in the distribution folder name, allowing users and developers to distinguish between versions. These version numbers include the timestamp at which the distribution was created, letting users and developers compare versions chronologically. The numbers that actually indicate major and minor versions appear to have remained at 1.0 since the first downloads became available.


Next, you will want to exercise the system under both valid and invalid inputs, and see if the system responds in a useful way under both conditions. You must document what inputs you provided to the system, how the system responded to the inputs you provided, and then come to a conclusion regarding the "usefulness" of the system.

The tests of the system are shown below:

Valid input:



Invalid input:




Review question 3: Can an external developer successfully understand and enhance the system?

To answer this question, you should first carefully review the Developer's Guide wiki page to see if it provides clear and comprehensive instructions on how to build the system from sources. The Developer's Guide wiki page should also indicate whatever quality assurance standards are being followed for this project (automated, or "manual" (testing)), and how a new developer can ensure that any code they might write adheres to these standards. If there are coding standards, are these documented? If there is a development process that is being followed, such as Issue Driven Project Management, does the Developer's Guide explain the way that the developers for this project have implemented this process? If the system is under Continuous Integration, does the Developer's Guide provide a link to the CI server associated with this project? Finally, does the developer's guide explain how to generate JavaDoc documentation?



The Developer's Guide wiki page on the cycuc project site provides clear instructions on how to build the system with Ant. The guide also mentions the automated quality assurance tools used on the project; however, specific information about those tools is not given, and developers are simply informed that the verify task will run all of them. A link to the formatting guidelines documents the stylistic rules the code is to follow. The Developer's Guide does not mention Issue Driven Project Management or Continuous Integration. Similarly, instructions on how to generate JavaDoc documentation are not available, though the documentation does appear to come with the project in /doc.



Next, check out the sources from SVN (read only), and see if you can generate the JavaDoc documentation. If it can be generated, review all of the JavaDoc pages to see if they are well-written and informative. Do the JavaDocs provide a good understanding of the system's architecture and how individual components are structured? Do the names of components (packages, classes, methods, fields) clearly indicate their underlying purpose? Is the system designed to support information hiding?



JavaDoc documentation, as mentioned above, comes with the project in /doc. However, developers may still generate JavaDoc files through Ant or Eclipse. The JavaDoc documentation itself tends to be well-written, though there are some questionable points and the description is somewhat sparse. Several methods lack descriptions in their JavaDoc documentation. There are a few contradictions within the documentation, as in CurrentPower.java where the description for the printResults method (line 28) indicates that the text printed is based on days[0] while the parameter tag for days (line 31) states that days is ignored. However, the JavaDoc documentation did show the organization of the system, and the names of the various components were well matched with their actual purposes. The system does appear to have been designed to implement information hiding, with the Command interface serving as an example.



Next, see if you can build the system from sources without errors. See if you can generate coverage information regarding the system. Next, review the test case source code to see how the current developers are assuring the correctness of the functionality of the system. By combining information from the coverage tool with review of the testing source code, you should come to a conclusion about how well the current set of test cases can prevent a new developer from making enhancements that break pre-existing code.



The cycuc system builds without errors in most cases. A timeout while attempting to access the server will cause the entire build process to stop, which accounts for the instances in which the build fails. Aside from timeouts, the system builds properly.

The data that Jacoco provides concerning test coverage does induce some slight concerns about the validity of the testing. The halealohacli package has no testing at all. Testing on the halealohacli.processor package covers 67% of the code and 58% of the possible branches. For halealohacli.command, 94% of the code was executed in testing, while 59% of the branches were taken. (These values seem to vary upon repeated testing; this may be due to the aforementioned timeouts.) These low values for branch coverage in particular may stem from a lack of testing for invalid input. As a result, none of the exceptions are checked. The tests indicate that parts of the system work for a particular input; however, as there is only one test per test class (with the exception of TestProcessor) it is difficult to be certain that the system does behave correctly. Thus, the existing testing will not necessarily stop new developers from breaking the system; the testing ensures that developers cannot treat valid input incorrectly, but does nothing to stop invalid input from causing problems.
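To illustrate the kind of invalid-input testing the coverage data suggests is missing, here is a hedged sketch: a hypothetical date-argument validator (not cycuc's actual code) together with the sort of check that would exercise its exception branch.

```java
// Illustrative only: a date-argument validator like the commands need,
// plus the kind of invalid-input check the review found missing.
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateArg {
  /** Parses a yyyy-MM-dd argument, rejecting malformed input. */
  public static Date parse(String arg) {
    SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd");
    format.setLenient(false);
    try {
      return format.parse(arg);
    }
    catch (ParseException e) {
      throw new IllegalArgumentException("Invalid date: " + arg, e);
    }
  }

  public static void main(String[] args) {
    // Valid input is what the existing tests already cover...
    assert parse("2011-11-05") != null;
    // ...but a test suite should also take the exception branch.
    try {
      parse("2011-13-99");
      throw new AssertionError("Expected rejection of an invalid date");
    }
    catch (IllegalArgumentException expected) {
      // Branch covered: invalid input is rejected.
    }
  }
}
```

A handful of tests along these lines per command would raise the branch coverage Jacoco reports and stop invalid input from slipping through unnoticed.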



Now read through the source code, and check to see if coding standards are followed and if the code is commented appropriately. Is the code easy to understand, and is there neither too much nor too little commenting?



With regard to coding standards, there exist several minor deviations from the standards that do not affect the readability of the code. The amount of comments varies: at times, there is a comment explaining every line of code, while at other points there are entire blocks of code without any documentation. The deviations from the coding standards are provided below:

EJS-07: Include white space.
There is a lack of whitespace in the test methods of TestRankTowers, TestDailyEnergy, and TestCurrentPower.

EJS-13: Capitalize only the first letter in acronyms.
HaleAlohaClientUI class capitalizes “UI” instead of only capitalizing the first letter. Admittedly, “HaleAlohaClientUi” might have been confusing to read.

EJS-29: Qualify field variables with “this” to distinguish them from local variables.
In HaleAlohaClientUI:
HaleAlohaClientUI: prompt (line 66)
isFinished: finished (line 27)
promptForOperation: scanner (line 130)
In Operation:
getString: string (line 35)
In CurrentPower:
getPowerConsumed: powerConsumed (line 24)
In DailyEnergy:
printResults: energy (line 44)
In TestProcessor:
testGetSource: processor (lines 37, 47, 59, 60, 75, 80, 81)

EJS-30: When a constructor or “set” method assigns a parameter to a field, give that parameter the same name as the field.
In Processor:
setSource (line 135)
setTimestamp (line 81)
Note though that in both of these cases the methods are not actually setting the field to the parameter value.

EJS-31: Use uppercase letters for each word and separate each pair of words with an underscore when naming constants.
In HaleAlohaClientUI:
prompt (line 34)
In Operation:
quit (line 10)
help (line 12)
currentPower (line 14)
dailyEnergy (line 16)
energySince (line 18)
rankTowers (line 20)

EJS-33: Keep comments and code in sync.
In HaleAlohaClientUI:
“When we have the processor class implemented...” (lines 38-39)
The Processor class is already implemented as of this writing.

EJS-35: Use documentation comments to describe the programming interface.
In HaleAlohaClientUI:
JavaDoc comments were used repeatedly where single-line comments would have been preferable.

EJS-53: Provide a summary description for each class, interface, field, and method.
In HaleAlohaClientUI:
isFinished (line 26)
In DailyEnergy:
getEnergy (line 48)
In EnergySince:
getEnergy (line 49)
In RankTowers:
rankTow (line 56)
In Processor:
getTimestamp (line 179)
getBeginningTimestamp (line 187)
getEndTimestamp (line 195)

ICS-SE-Java-6: Format JavaDoc summary lines correctly.
In TestDailyEnergy:
test (line 22)
The first “sentence” in the JavaDoc documentation is “1.” This does not adequately describe the method.

Overall though, the code is readable. Admittedly, the person reviewing the code had already implemented the same project with a separate group and was thus familiar with its objectives, which may have influenced these results and opinions.



Next, check the Issues page associated with this project. Is it clear what parts of the system were worked on by each developer? If an external developer had a question regarding a certain part of the system or a certain aspect of its behavior, could the Issues page be used to determine which developer would be the best person to ask? Does the current system appear to result from approximately equal input from all of the developers, or did some developers appear to do much more than other developers?



Looking through the Issues page associated with this project, it is clear what parts of the system were worked on by each developer. This team utilized a variety of status options available to better inform an external developer what worked and what didn’t work with respect to project progression. In some cases, clarification in the form of comments show the decision making process this team used when dealing with issues. Since each issue described clearly explains what the task was, it should be easy for an external developer to determine which developer would be the best person to collaborate with. In terms of work input from all of the developers, it appears that some team members did more than others.



Now check the CI server associated with this project. Apart from Nov 22-24 when there were known outages, did any build failures get corrected promptly? Was the system worked on in a consistent fashion? Were at least 9 out of 10 commits associated with an appropriate Issue?



Turning to the CI server associated with this project, it appears all build failures were corrected promptly, with a maximum latency of roughly 30 minutes. Also, looking through each successful build, this team showed that they were working on the project in a consistent fashion, with at least 9 out of 10 commits associated with an appropriate Issue.

Tuesday, November 29, 2011

Hearts of Darkness: A Programmer's Apocalypse

Group work. For the most part, working with other people can be an unpleasant experience. Oftentimes one finds oneself carrying the group for whatever reason, but in this instance I can gladly say this wasn't the case. After getting our feet wet with the WattDepot API, my peers and I were split into groups to apply everything we've learned so far in software engineering (Ant, Google Project Hosting, Jenkins, etc.) to develop and maintain a small application that displays energy usage in the Hale Aloha residence halls.

Taken from the help command in our program, the following were to be implemented:

current-power [tower | lounge]
Returns the current power in kW for the associated tower or lounge.
daily-energy [tower | lounge] [date]
Returns the energy in kWh used by the tower or lounge for the specified date (yyyy-mm-dd).
energy-since [tower | lounge] [date]
Returns the energy used since the date (yyyy-mm-dd) to now.
rank-towers [start] [end]
Returns a list in sorted order from least to most energy consumed between the [start] and [end] date (yyyy-mm-dd)

With three people in the group, we decided to split up the tasks as follows:

Input Parsing - To validate user input and make sure that it makes sense.
Program Processes - Takes user input and retrieves information from the WattDepot servers.
User Interface - Provides a simple command-line interface that allows for user interaction.
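The way these three pieces fit together can be sketched with a simple Command interface. The names below are hypothetical, chosen for illustration; the actual hale-aloha-cli-teams classes may differ.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of how parsing, processing, and the UI could connect.
interface Command {
  String getName();
  String execute(String[] args);  // receives already-validated arguments
}

class CurrentPower implements Command {
  public String getName() { return "current-power"; }
  public String execute(String[] args) {
    // The processing layer would call WattDepot here; stubbed for illustration.
    return args[0] + "'s power is 2.3 kW.";
  }
}

class Processor {
  private final Map<String, Command> commands = new HashMap<>();

  Processor() {
    Command currentPower = new CurrentPower();
    this.commands.put(currentPower.getName(), currentPower);
  }

  /** Parses a raw input line, validates the command name, and dispatches. */
  String process(String line) {
    String[] tokens = line.trim().split("\\s+");
    Command command = this.commands.get(tokens[0]);
    if (command == null) {
      return "Unknown command: " + tokens[0];
    }
    String[] args = new String[tokens.length - 1];
    System.arraycopy(tokens, 1, args, 0, args.length);
    return command.execute(args);
  }
}
```

The user interface only needs to read a line, hand it to the Processor, and print the returned string, which is roughly the flow our three-way task split followed.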

The assignment of tasks was deliberate, so as to cater to our strengths and weaknesses. In this case, only one of us had completed all of the WattDepot katas in the previous class assignment.

Initially, there were some things to get used to, like creating an issue in Google Project Hosting that describes the task at hand. For example, the tasks above were broken down into smaller subtasks intended to be completed in 1-2 days. As detailed in the Issue Driven Project Management lecture, we tried to adhere to some rules that would lead to a successfully managed project:

Divide the work into tasks.
No task takes longer than 2 days.
Each task is specified by an issue.
Each issue has a single owner.
At all times, every person has an open task that they are responsible for completing.
Every commit specifies an issue in its log comment.

However, there were some issues with respect to these goals. Because of the way we subdivided the tasks, there were times in the project when I was blocked from completing my own. Working on the user interface, I described to the member handling the parsing portion how I intended to send user input to his package of commands. From there, he would parse the input and check its validity before passing it on to the team member handling data processing. Finally, the output returned by the WattDepot API would trickle back to me, where it was finally printed to the screen. Although we could have broken the Processes package into smaller tasks, one group member felt confident that he could handle all four commands without issue. This same group member happened to be the most talented of our group with respect to Java, and although we had expressed interest in creating our own JUnit tests for each package, he churned out the JUnit tests for each command in the processor package like he had been doing it all his life. Even so, when it came time to put everything together, I often had to go back into each test and verify that everything was working properly, since we had some cross-platform issues regarding Ant and the verify task.

Instead of meeting in person regularly, we decided to meet online once a week over Google Docs, using the document feature to take notes that everyone could see. This worked out pretty well, since the document acted as a virtual whiteboard for us to hash out ideas. We also met twice a week before class to clarify anything that needed to be addressed, and we used a standard email thread between the three of us to collaborate asynchronously. This kept our group up to date on the issues going on with the project. When someone finished a task, the question of "What now?" would often come up, and we were able to quickly find things to work on, even if they were trivial in nature. For example, while two people were finishing up the Java portion of the project, another would be administering the Google Project Hosting site, maintaining wiki pages and evaluating the issues and updates pages to see if they met the class standards. This kind of work ethic ensured that we were always on top of things despite the rocky start in getting used to the whole process of project management.

In terms of what we accomplished, we were able to implement all commands described above with little issue.

From our Google Project Hosting site:

https://code.google.com/p/hale-aloha-cli-teams/

This is a command line interface that allows users to view various information about power and energy consumption in the Hale Aloha residences on the campus of the University of Hawaii at Manoa. The current commands implemented are:

current-power: Finds the current power consumption for sources in the Hale Aloha residences.

daily-energy: Finds the energy consumption for a given source on a given day.

energy-since: Finds the energy consumption for a given source since a given date.

rank-towers: Sorts the Hale Aloha towers based on energy consumption.

Overall, I am satisfied with what we have produced. It would have been interesting to see how far we could have taken this project by extending it using Java Reflection, but we wanted to make sure we had a solid product before trying something new.

Tuesday, November 8, 2011

WattDepot Katas

Continuing off my last blog entry about energy in Hawaii, there is an energy conservation competition going on over at the first-year dorms called the Kukui Cup. The system that facilitates this competition is the WattDepot web service which collects electricity data from meters found in these dorms. To get my feet wet with the use of the WattDepot API, I implemented some of the following katas.

Each kata accepts a URL to a WattDepot server that provides energy consumption data for the Hale Aloha residence halls at the University of Hawaii. Each kata lists all sources defined at that URL and displays additional information specific to the kata.

Kata 1: SourceListing
Lists all sources defined at a given URL.
Kata 2: SourceLatency
Lists the number of seconds since data was received sorted in ascending order.
Kata 3: SourceHierarchy
Show the hierarchy of all subsources of a given source.
Kata 4: EnergyYesterday
Lists amount of energy in watt-hours consumed during the previous day for each source.
Kata 5: HighestRecordedPowerYesterday
Lists highest recorded power associated with a source during the previous day.
Kata 6: MondayAverageEnergy
Lists average energy consumed by a source during the previous two Mondays. Sorted in ascending order by watt-hours.
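Several of these katas share a pattern: compute a value per source, sort it, and print an aligned table. A sketch of that pattern, with made-up source names and numbers (not actual WattDepot API calls):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative sketch of the sort-and-format pattern the katas share.
public class KataTable {
  public static class Row {
    final String source;
    final double value;

    public Row(String source, double value) {
      this.source = source;
      this.value = value;
    }
  }

  /** Sorts rows ascending by value and formats each as an aligned line. */
  public static List<String> render(List<Row> rows) {
    List<Row> sorted = new ArrayList<>(rows);
    sorted.sort(Comparator.comparingDouble(row -> row.value));
    List<String> lines = new ArrayList<>();
    for (Row row : sorted) {
      // %-20s left-pads the source name; %10.1f right-aligns the value.
      lines.add(String.format("%-20s %10.1f", row.source, row.value));
    }
    return lines;
  }
}
```

In the real katas the value would be latency in seconds, energy in watt-hours, and so on, fetched per source from the WattDepot client.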

One obvious thing I've noticed about WattDepot is that it's very complex! Some of these katas seem trivial on paper, but trying to implement them was a different story. With the first kata, a simple copy and paste of an example program did half the work, but the other half involved learning the intricacies of the System.out.format method. Unlike System.out.println, which simply prints data to standard output, System.out.format allows data to be printed in a nicely formatted list. With the SourceLatency kata, I found that the latency given by the WattDepot server may have already been sorted, since printing each source with its computed latency shows that it is indeed in ascending order. To make sure, I tried resetting the variables involved in computing latency to zero before computing the latency of the next source. Finally, SourceHierarchy proved interesting, since the subsources given by each source were provided in a not-so-nice format. For example, given a source Ilima, the subsources of Ilima would look something like this:

[http://someurl.com/subsource, http://someurl.com/subsource]

It wasn't perfect, but using a combination of Java's String.split method to break up the URIs and WattDepot's UriUtils class, I was able to extract the subsource name from each subsource URI.
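The extraction can be sketched with plain string handling, assuming the bracketed, comma-separated format shown above (this is my own illustration, not the UriUtils-based code I actually wrote):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of pulling source names off subsource URIs.
public class SubsourceNames {
  public static List<String> extract(String subsources) {
    List<String> names = new ArrayList<>();
    // Strip the surrounding brackets, then split on commas.
    String trimmed = subsources.replaceAll("^\\[|\\]$", "");
    for (String uri : trimmed.split(",")) {
      uri = uri.trim();
      // The source name is the final path segment of the URI.
      names.add(uri.substring(uri.lastIndexOf('/') + 1));
    }
    return names;
  }
}
```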

However, I wasn't able to complete the last three katas in time, since each kata builds upon the last. The EnergyYesterday kata requires computing the previous day, which carries over to the fifth and sixth katas. Even though I was able to get the previous day's timestamp in a nice format, I wasn't able to properly compute the energy consumed by each source. Looking at the last two katas, they both do something similar: compute some previous day along with the data recorded on that day.

Finally, in terms of time spent, I found that each kata took more time to complete than the previous one. This is not surprising, since each new kata gets progressively more difficult. Overall, I found these katas enlightening in that they showed me how to approach a programming problem. Working with one of my peers last week to get started with these katas, I was overwhelmed by the amount of complexity he put into the second kata with respect to sorting by latency. There is always a brute-force approach to solving these katas, but I like to stop and ask myself, "Is there a better way?" Surely there exists some method in Java that can help solve these programming katas without producing spaghetti code.

Tuesday, November 1, 2011

Care for the Land: Energy in Hawaii

Having lived here all my life, I've been aware that the cost of living in Hawaii is one of the highest in the nation. There are many reasons why energy is so expensive in Hawaii, but the main one is our reliance on imported oil. As a result, the citizens of this state must make decisions today that will affect their livelihoods, both for themselves and for future generations.

There are many challenges the Hawaii state government faces when dealing with energy consumption. For example, each island in the state has its own power grid, which does not allow energy produced on one island to be transferred to another. Oahu consumes the most energy of all the Hawaiian islands, yet energy produced on other islands cannot be utilized on Oahu. On the mainland, one state can easily sell electricity to a neighboring state that needs it. This is facilitated by the fact that power generated on the mainland comes from a variety of natural resources, such as coal and natural gas, decreasing the need for imported foreign oil.

Fortunately, Hawaii's geographical location offers huge renewable-energy potential to address these challenges. The abundance of natural resources such as geothermal, water, and wind allows us to exploit these resources quickly and effectively. Our small size is also an advantage: our energy consumption is relatively modest, which pairs well with renewable energy.

The Hawaii state government has made some progress on these challenges with the Hawaii Clean Energy Initiative, under which the state will try to achieve 70% clean energy by 2030. To meet that target, we need to find ways to reduce our energy consumption today, and with tomorrow's technology we can expect to consume less energy than we do now. To that end, HNEI offers various devices that can be installed in people's homes to measure energy consumption, with two-way communication to an off-site database. This data is then made available on a website where residential customers can see their energy consumption, educating them on how to adjust their power usage. For example, a device called a smart thermostat interacts with an A/C unit, shutting it off periodically to conserve energy when the sampled temperature is at a comfortable level. If enough of these devices are in use, the energy savings to customers would be huge and the demand on the electric companies would go down. Another big potential of this technology is the ability to model energy usage: if there are times of day when excess energy is produced because demand is low, that information can be communicated to customers so they can utilize this free excess energy to fulfill their needs.

Here on the University of Hawaii at Manoa campus, Kuykendall Hall is undergoing a renovation to facilitate the measurement of energy usage and environmental factors such as temperature, humidity, air flow, and radiant heat. This information is used to help engineers design buildings that are environmentally neutral, where power consumption is lower as a result of smart building design that allows for better lighting and cooling. With the installation of these metering devices, there is an opportunity for software developers with an interest in clean energy to design software that processes the collected data to help people make better decisions about building design.

Going forward, we can lower our energy consumption with the help of HNEI. If demand-response devices are installed in everyone's homes, that will certainly help us achieve the goal of 70% clean energy. All of this depends on good software to manage the data these devices measure, allowing people to make better decisions about energy consumption.

Friday, October 21, 2011

Software Engineering So Far: A Midterm study guide

Halfway into the semester, we've covered a sizable amount of content with respect to software engineering. It's interesting to see how early on we learned about coding standards and how that has come full circle now that we're dealing with build systems and configuration management. Without some kind of standard, all there will ever be is confusion and frustration. Below are five sample midterm questions I believe are important to what we've learned so far this semester. None of the questions are cherry-picked, as I believe all the material is important in one way or another.

The following are five questions and answers about the topics I've learned up until this point.

1) What is Ant?

Ant is a build tool for Java that lets you construct build scripts in much the same way as the "make" tool does for C or C++.
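As a rough sketch (the project and target names here are my own, not from class), a minimal build.xml might look like:

```xml
<!-- build.xml: a hypothetical minimal Ant build file. -->
<project name="hello" default="hello" basedir=".">
  <!-- A target is roughly the Ant analogue of a make rule. -->
  <target name="hello" description="Print a greeting">
    <echo message="Hello, Ant!"/>
  </target>
</project>
```

Running `ant` in the same directory executes the default target and prints the greeting via the echo task.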


2) Ant properties are immutable. Explain what this means and give an example.

Once an Ant property is set, it cannot be changed later in the build. For example,

<property name="MyProperty" value="One"/>
<echo message="MyProperty = ${MyProperty}"/>
<property name="MyProperty" value="Two"/>
<echo message="MyProperty = ${MyProperty}"/>

will print the following, because the second property assignment is silently ignored:

[echo] MyProperty = One
[echo] MyProperty = One


3) When deploying new software configuration management tools it's always good practice to use common build tools. Why?

Much time is wasted when a developer cannot reproduce a problem found in testing, or when the released product varies from what was tested. Ensuring that everyone working on a project is using the same tools makes tracking down a problem easier.


4) List and explain what a software review exposes that testing does not.

Various answers include:

Reviews are proactive tests: they can find errors that testing cannot, such as an unimplemented requirement.

5) ICS-SE-Java-2 says not to use the wildcard "*" in import statements. Why?

You must explicitly import each class that you use from other packages. This is an important form of documentation for readers of your class. It doesn't make sense for a program to throw in the kitchen sink when solving a problem.
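A small illustration (the class and its contents are hypothetical, just to show the convention):

```java
// Explicit imports double as documentation: a reader can see at a
// glance exactly which external classes this file depends on.
import java.util.ArrayList;
import java.util.HashMap;
// import java.util.*;  // discouraged by ICS-SE-Java-2: hides dependencies

public class ImportDemo {
  // Count distinct item names in a tiny inventory map.
  public static int count() {
    HashMap<String, Integer> stock = new HashMap<>();
    stock.put("widgets", 42);
    ArrayList<String> names = new ArrayList<>(stock.keySet());
    return names.size();
  }
}
```

With wildcards removed, a reviewer immediately knows this class uses only ArrayList and HashMap from java.util.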

Thursday, October 20, 2011

Configuration Management and Google Project Hosting

For a couple of months now, I've been using my Dropbox account to keep track of my work. When I'm at home I use my Windows machine, since I prefer to program on a 24-inch screen; when I'm away I have to use my 13-inch MacBook. I simply do not like emailing myself source code because of the overhead it creates on my email account. Early on I also found that it wasn't a very good idea to set my Eclipse IDE workspace to a folder synced with Dropbox, since it would have been a hassle if I really screwed something up. So this whole idea of configuration management coupled with Google Project Hosting is an alternative to my current system of simply copying and pasting code as I transfer work from one machine to another.

As a task, I've set up my robocode-gja-shootnscoot system at Google Project Hosting. Although it took quite a bit of time to accomplish all the tasks, I feel it was a good learning experience. In terms of difficulty, I found that writing the wiki pages took the most time and effort, since I can definitely see people getting confused about how to run an unfamiliar system when all they know is how to click the "Run" button in their favorite IDE. One thing I've noticed when working with my peers on concurrent revision commits is the potential for a bottleneck when too many people try to commit their changes at once, though the chance of this happening constantly seems minimal when the number of people working on a project is small.

Overall, I feel that configuration management is a powerful tool that allows people to collaborate more effectively than just relying on some kind of mailing list. It allows the group to keep track of progress on a given project and makes the system available for everyone to improve.

The ShootNScoot system hosted on Google Project Hosting: https://code.google.com/p/robocode-gja-shootnscoot/

Tuesday, October 11, 2011

ShootNScoot: A Competitive Robot

As the name implies, the ShootNScoot robot is roughly modeled on the standard military tactic of Shoot-and-scoot. Although not as elegant as the actual tactic described in the link, ShootNScoot displays the basic strategy of firing at a target and then immediately moving away from the location where the shots were fired.

The basic design of the robot is as follows:

Movement: Using a random number generator, a firing position (X, Y) is generated, and the robot moves there at the beginning of each turn. This deviates a bit from the Shoot-and-scoot tactic by initially moving and then firing; however, by the start of the second turn the robot should exhibit normal Shoot-and-scoot behavior by 1) firing at an enemy and 2) moving from the previous firing position to a new one in an attempt to evade enemy fire.
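The position-picking step could be sketched like this (the helper class and the wall margin are my own illustration, not the actual ShootNScoot source):

```java
import java.util.Random;

// Hypothetical sketch of ShootNScoot's movement step: pick a random
// firing position, keeping a margin so the robot stays off the walls.
public class FiringPosition {
  private static final Random RNG = new Random();

  // Returns {x, y} uniformly distributed inside the battlefield,
  // at least `margin` pixels away from every wall.
  public static double[] next(double fieldWidth, double fieldHeight, double margin) {
    double x = margin + RNG.nextDouble() * (fieldWidth - 2 * margin);
    double y = margin + RNG.nextDouble() * (fieldHeight - 2 * margin);
    return new double[] { x, y };
  }
}
```

Each turn the robot would move to the point returned by `next(...)`, fire, and then pick a fresh position.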

Targeting: This robot employs a sort of hit-and-run tactic by attacking various targets instead of focusing on just one robot. This is in the spirit of Guerrilla warfare which goes along with the Shoot-and-scoot strategy.

Firing: Sticking with the Shoot-and-scoot philosophy, this robot will not shoot unless it is fully stopped. Also, depending on distance to target, bullet power is varied to increase the chance of a successful hit. Ideally, the robot should be able to fire while moving given an enemy with low energy.
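The distance-to-power scaling might be sketched as below. The scaling constant 400 is an illustrative guess, not the real implementation; the clamping bounds come from Robocode's legal bullet power range of 0.1 to 3.0.

```java
// Hypothetical distance-based firing power: close targets get full
// power, distant targets get weaker (cheaper) bullets, since far
// shots are more likely to miss.
public class FirePower {
  public static double powerFor(double distance) {
    double raw = 400.0 / distance;            // illustrative scaling factor
    return Math.max(0.1, Math.min(3.0, raw)); // clamp to Robocode's legal range
  }
}
```

In a robot's scan handler this would be used as something like `fire(FirePower.powerFor(e.getDistance()))`.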

Pitting ShootNScoot against the sample robots shows weaknesses in my Shoot-and-scoot implementation. The following are one-vs-one battles over five rounds, with the score percentage in parentheses.

ShootNScoot(10%) vs. Walls(90%)
ShootNScoot(42%) vs. RamFire(58%)
ShootNScoot(26%) vs. SpinBot(72%)
ShootNScoot(81%) vs. Crazy(19%)
ShootNScoot(78%) vs. Fire(22%)
ShootNScoot(46%) vs. Corners(54%)
ShootNScoot(31%) vs. Tracker(69%)

There were a couple of close ones, but Walls by far will always dominate ShootNScoot. Given that Walls is always on the move, the only reliable way to win against it would be to fire ahead of the robot to score a successful hit. Another robot with a high win percentage is SpinBot, which employs a tactic similar to Walls by constantly moving in circles while firing at the same time. This allows SpinBot not only to evade random fire from ShootNScoot but to successfully return fire. In most situations the design of ShootNScoot didn't work because too much time is spent moving around and not shooting back. Although the idea of moving to a randomly generated position seems sound, getting hit while moving to that position doesn't help the robot at all. To improve the design, I would improve its firing system by allowing ShootNScoot to fire while moving to a new position. I would also try to predict enemy tactics by storing enemy information and adjusting my robot's tactics on the fly.

In terms of testing, two acceptance tests were created that verified that ShootNScoot can reliably beat the SittingDuck and Fire robots. The choice of SittingDuck was a no-brainer since this served as a basis for additional testing and to simply show that ShootNScoot does not spontaneously combust upon the start of battle. Out of all the other sample robots, ShootNScoot showed it was able to win against Fire going 10+ rounds. Finally, various behavioral tests were created to verify that the following components of ShootNScoot are working as intended:

1) Generation of random position.
2) Movement to randomly generated coordinate.
3) Variable firing system.
4) Target acquisition.

With regard to software engineering, this project has taken the Three Prime Directives head on and exposed me to what it takes to ensure that those directives are met. I feel that if I had written the test cases before the actual robot implementation, ShootNScoot would have come out a little better. The reality is that the test cases were developed in response to the robot implementation, and my testing reflects that. Automated quality assurance tools like PMD and FindBugs picked up on little things in my source code that I would otherwise have missed. Next time, I would definitely focus more on testing before implementing the robot. That way, testing would expose poor design choices against other robots and would allow me to adjust my design specifications to create a more powerful robot.

Thursday, September 29, 2011

Ant Code Katas

Building upon my last blog entry on the application of Katas on improving my programming, I applied this concept to the Apache Ant build system for Java. Although not my first time messing around with a build system, this is by far the most complex I've encountered. Similar to the Make utility found on most Unix-based operating systems, the Apache Ant build system automatically builds executable programs from source code and libraries.

For this particular build system, I worked on the following katas:

1. Ant Hello World
2. Ant Immutable Properties
3. Ant Dependencies
4. Hello Ant Compilation
5. Hello Ant Execution
6. Hello Ant Documentation
7. Cleaning Hello Ant
8. Packaging Hello Ant

Sparing the details of each kata (you can read about them here), each kata gets more difficult along the way. Funnily enough, learning how to print words to the screen in Ant took a bit longer than expected. Unlike with a programming language, it seems the first thing the authors of a build system would like you to do is actually use their program to build a system. I knew I had to enclose what I wanted to print in an "echo" task, but actually setting up the script to do this took longer than expected. A quick Google search revealed resources on how to do exactly this.

Next, completing the immutability and dependency scripts was pretty painless, since the Wikibooks website provided a good tutorial on those two concepts; however, I would soon hit a brick wall with the next three scripts. Of the three, compilation wasn't too bad, since the Ant manual provides a good basis for how to do it. With the execution script I discovered the pain of trying to run a Java program using Ant. I went back and forth a few times deciding whether to follow the Apache model, JAR my HelloWorld program, and execute the JAR, or to simply find a way to execute the program without that step. Finally, I found the documentation script a little simpler, since I only had to be concerned with the location of the source files and the directory where I wanted to store the Javadoc reference files. Luckily, I had no problems with the clean and distribution scripts, since I found the sample scripts from another build system distribution straightforward.
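The two execution options I weighed look roughly like this (the target, class, and path names are illustrative, not my actual kata scripts):

```xml
<!-- Option 1: run the compiled class directly, no JAR required. -->
<target name="run" depends="compile">
  <java classname="HelloAnt" classpath="build/classes" fork="true"/>
</target>

<!-- Option 2: the Apache-style route, package a JAR and run it.
     Note that the jar attribute requires fork="true". -->
<target name="run.jar" depends="jar">
  <java jar="build/hello.jar" fork="true"/>
</target>
```

Option 1 is the shortcut I was looking for; option 2 matches the model Apache's own examples tend to follow.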

With these code katas I was able to dive a little deeper into just how complex build systems can get past the compilation and execution stages. Given the headaches I've encountered when trying to run other people's Java implementations on my computer, it is clear how useful, and in my case necessary, a robust build system is to accompany any software system.

Tuesday, September 20, 2011

Robocode Code Katas

Having no prior knowledge of what a "Kata" is, I took the time to read through the following articles to get myself acquainted with the idea of a "code kata".

http://en.wikipedia.org/wiki/Kata_(martial_arts)
http://www.codinghorror.com/blog/2008/06/the-ultimate-code-kata.html

The idea is sound. The only way I can get better as a programmer is to constantly challenge myself with harder sets of programming problems. Although I often find myself in an impossible situation when trying something new, I've always come away with something from the experience.

Applying this concept to Robocode, an open source programming game, revealed my limitations as a programmer and how the code kata concept can work seamlessly with any programming exercise. In this case, I challenged myself to thirteen exercises devoted to teaching me the basic fundamentals of a robot in the Robocode universe.

Position01: The minimal robot. Does absolutely nothing at all.
Position02: Move forward a total of 100 pixels per turn. When you hit a wall, reverse direction.
Position03: Each turn, move forward a total of N pixels per turn, then turn right. N is initialized to 15, and increases by 15 per turn.
Position04: Move to the center of the playing field, spin around in a circle, and stop.
Position05: Move to the upper right corner. Then move to the lower left corner. Then move to the upper left corner. Then move to the lower right corner.
Position06: Move to the center, then move in a circle with a radius of approximately 100 pixels, ending up where you started.
Follow01: Pick one enemy and follow them.
Follow02: Pick one enemy and follow them, but stop if your robot gets within 50 pixels of them.
Follow03: Each turn, Find the closest enemy, and move in the opposite direction by 100 pixels, then stop.
Boom01: Sit still. Rotate gun. When it is pointing at an enemy, fire.
Boom02: Sit still. Pick one enemy. Only fire your gun when it is pointing at the chosen enemy.
Boom03: Sit still. Rotate gun. When it is pointing at an enemy, use bullet power proportional to the distance of the enemy from you. The farther away the enemy, the less power your bullet should use (since far targets increase the odds that the bullet will miss).
Boom04: Sit still. Pick one enemy and attempt to track it with your gun. In other words, try to have your gun always pointing at that enemy. Don't fire (you don't want to kill it).

My approach for this particular exercise was simple: use as little trigonometry as possible. For the Position robots, everything was fine until I got to Position04. Knowing that the robot spawns at a random location on the map immediately caused me to retract my "no trigonometry" approach. But thanks to the power of Google, I found an IBM solution, a sample robot called DwRotater, in the article detailing Robocode basics.

http://www.ibm.com/developerworks/java/library/j-robocode/

DwRotater does exactly what the Position04 specification calls for, that is, it moves to the center of the playing field. I also used the basic outline of the code and modified it to send the robot off into the four corners of the map for Position05. But when it came to Position06, the "no trigonometry" motto proved fruitless, as I was not able to find a solution to move the robot in a circle.

The Follow robots proved simple enough: the built-in functions give a distance that allows the robot to close in on a particular enemy.

Finally, with the Boom robots things got interesting as I experimented with the firing mechanics of the game. Boom03 in particular shows the effect that manipulating the values issued to the fire command has on enemy tanks. Boom04 also proved to be a hard one, since I was only able to track an enemy sitting still by simply rotating the gun in that direction.

With that said, I was able to complete most of the exercises using no trigonometry whatsoever. That was obviously not a smart decision, since I'm sure most if not all of the elite robots use trigonometry for the more advanced stuff like evasion techniques or firing at the enemy on the move. To build a competitive robot I simply need to learn how not to get hit while still being able to fire effectively at targets. In general, the code kata approach helped by starting with an easy problem and gradually increasing the difficulty.

Tuesday, August 30, 2011

FizzBuzz in Eclipse


The following article provides insight into the FizzBuzz problem presented to prospective computer programmers:

You Can't Teach Height - Measuring Programmer Competence via FizzBuzz

“Write a program that prints the numbers from 1 to 100. But for multiples of three print “Fizz” instead of the number and for the multiples of five print “Buzz”. For numbers that are multiples of both three and five print “FizzBuzz”.”

In retrospect, I did not expect to encounter any problems when implementing the FizzBuzz application in Eclipse. Having some experience with Eclipse over the years and a copy of my in-class solution to this problem, I was confident the application would produce the correct output on the first run.

Expected output for the first 15 numbers:

1
2
Fizz
4
Buzz
Fizz
7
8
Fizz
Buzz
11
Fizz
13
14
FizzBuzz

Actual output from my implementation for the first 15 numbers:

1
2
Fizz
4
Buzz
Fizz
7
8
Fizz
Buzz
11
Fizz
13
14
Fizz

Obviously this wasn’t correct.

Without really thinking too much about the logic of the program, my checking conditions produced erroneous output because of the order in which I checked the value of number. By checking for numbers divisible by both 3 and 5 after checking for divisibility by 3 and by 5, the application printed “Fizz” for the values 15, 30, etc. instead of “FizzBuzz”. Simply checking the remainder of number divided by 15 as the first condition fixed the problem.
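The corrected ordering looks like this (the helper method is my own restructuring; the original was a single main):

```java
public class FizzBuzz {
  // Check divisibility by 15 FIRST: otherwise 15, 30, ... fall into
  // the %3 branch and print "Fizz" instead of "FizzBuzz".
  public static String label(int n) {
    if (n % 15 == 0) return "FizzBuzz";
    if (n % 3 == 0) return "Fizz";
    if (n % 5 == 0) return "Buzz";
    return Integer.toString(n);
  }

  public static void main(String[] args) {
    for (int i = 1; i <= 100; i++) {
      System.out.println(label(i));
    }
  }
}
```

Putting the most specific condition first means the more general %3 and %5 checks can never shadow it.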



It took me 05:05.5 to complete the application with correct output, and 12:04.0 to finish writing in my comments.

Sunday, August 28, 2011

Open Source Software and the Three Prime Directives

PDF Split and Merge http://www.sourceforge.net/projects/pdfsam/


Overview: PDF Split and Merge is a simple, easy to use, free, open source application to split and merge PDF documents. There are console and GUI versions of the application where the GUI is written in Java Swing. A variety of external libraries is utilized to deal with the manipulation of PDF files, XML format, and encrypted documents.

Prime Directive 1: The system successfully accomplishes a useful task.

A simple Google search on how to merge or split PDF documents will reveal a variety of free and paid solutions for the end-user. A browser-based solution, while free to some extent, is limited to 15 MB files and offers no split functionality. A DIY method is available, but only for Mac OS X users. With that said, PDF Split and Merge allowed me to quickly merge two existing PDF documents in less than a minute. Although the interface could be a little more intuitive, patient users will find the process painless by reading the manual, which offers a how-to on merging and splitting PDF documents.


Prime Directive 2: An external user can successfully install and use the system.

The PDF Split and Merge website, http://www.pdfsam.org, has installers available for Windows and Mac OS X, a ZIP archive, and the source code. The tutorial from the site notes that a Java Runtime Environment is required to run the application. A detailed explanation of each function in the system is available to help the end-user complete basic tasks such as Merge/Extract and Split.

Prime Directive 3: An external developer can successfully understand and enhance the system.

A Software Requirements Specification document is available at http://www.pdfsam.org/uploads/PDFsam-SRS-v2.1.0-EN.pdf. This developer-level documentation details the basic version of PDF Split and Merge, allowing external developers to understand the system and help improve it. A feature request tracker, https://sourceforge.net/tracker/?atid=814268&group_id=160044&func=browse, has a couple dozen suggestions to improve the system’s feature set. For example, the program is currently limited to merging exactly two documents at a time. Going through the API and source code shows a lack of descriptive comments; however, the methods are named in such a way as to help an external developer understand the program.

Three Prime Directives of Java-based Open Source Software Engineering: