Wednesday, December 14, 2011

Uncharted Waters: Extending a Foreign System

Once again, it is time for team JCEV to tackle another CLI project for the Hale Aloha towers.  However, this time we did not design the system from scratch.  Instead, we took control of the hale-aloha-cli-kmj project that I previously reviewed and extended it by adding three new commands and updating the documentation.  So how did it turn out?  Well, as per usual I shall evaluate this new endeavor in the context of the three prime directives of open source software.

Prime Directive #1: Does the system accomplish a useful task?
To accomplish this prime directive, the system should fully implement the three commands specified.  The first command is set-baseline, which records the energy used by a tower or lounge over a specified day in one-hour periods and defaults to yesterday if no date is given.  In both cases, the system successfully retrieves this data and stores the information in a Baseline object for later use.

set-baseline command with optional date argument.
set-baseline without the optional date argument.
The second command, monitor-power, should periodically retrieve the current power consumption of the given tower at the given interval.  The interval argument is optional, and the interval between checks should default to 10 seconds if not specified otherwise.  In addition, the command should stop execution when the user presses a key like enter.  This version of the system contains all of this functionality, as it periodically prints the results as expected.  The only small issues are that the command will not stop unless the enter key is pressed (i.e. the s and the a in the image below did not stop the command's execution) and that terminating execution causes an invalid command error.  These do not really affect the functionality though, and hence this command is in working order.

The monitor-power command in action.
The third and final command is monitor-goal.  This command checks the current power consumption of the given tower or lounge and compares it to the baseline, which must be set before running this command.  If the current power consumption has been reduced by the specified goal percentage or more, then it prints that the goal has been met.  Otherwise, it indicates that the goal was not met.  The monitor-goal command does this periodically like the monitor-power command and should stop execution when the user presses a key like enter.  This implementation of monitor-goal suffers from the same problem as the monitor-power command in that it requires the user to press enter to exit, but once the user presses enter, the command exits as expected.  As a result, all three commands work to accomplish a useful task and this system fulfills the first prime directive.

The monitor-goal command successfully executing after the baseline has been set.

Prime Directive #2: Can an external user successfully install and use the system?
I believe that this system fulfills this prime directive as well.  The original project was a little lacking in user documentation due to a sparse home page and a few key errors in the UserGuide, but these have been fixed in the latest version.  The home page now sports many screenshots of example executions that users can reference, and the new UserGuide removes some of the misleading instructions that afflicted the original guide (i.e. the unsupported sources that were previously listed).  The system is also very easy to install, as there is a clearly labeled distribution to download on the website which contains an executable jar file that can be run without any compilation.  Consequently, I believe that the updated documentation should make this system easy to install and use, even for an external user.

Prime Directive #3: Can an external developer successfully understand and enhance the system?
Essentially, this entire exercise was a test of the third prime directive, as my team (JCEV) had to adopt this code-base, understand how it worked, and extend it.  While we were successful in doing so, it was not an ideal experience, as we had to figure out how to extend the existing system on our own.  The DevelopersGuide gave a general overview of how the system worked, but the specifics of what changes were needed when adding a new command were nowhere to be found.  As a result, we had to skim over the code in the CommandParser class to figure out exactly what these changes were before we could add our new commands.  Of course, this was a pain, and we decided to spare any future developers from this experience by listing the required changes in the updated DevelopersGuide.  The DevelopersGuide was also updated with the coding standards that should be used in the project and some instructions detailing how to generate JavaDocs to further assist any future developers.  These changes should make it much easier for an external developer to successfully understand and enhance the hale-aloha-cli-kmj system, and thus I think that it now fulfills the third and final prime directive.

Conclusion
Overall, this was an interesting experience in software engineering.  While we managed to implement all of the functionality and produced what I would consider high-quality software, it was not without its issues.  These issues mostly stemmed from trying to extend an unknown code-base with rather poor documentation, as a good amount of brain power and time was spent analyzing the existing source code to figure out how to fit our new features in.  This was probably the first time in my programming experience that I had to learn what someone else's code did by reading it, and it definitely emphasized the importance of good documentation.  Finally, this experience also made me realize just how essential good group members are.  When everyone did their part, the workload became much more bearable and the project progressed smoothly.  We did not finish with a lot of extra time like we did on our own CLI project, but we did have enough time to double check everything to give ourselves the peace of mind that we did our best and that the product is up to our standards.  As a result, I would like to thank Eldon Visitacion and Jordan Takayama for their hard work throughout this semester.  It has been a pleasure working with them in group JCEV and I am looking forward to working with quality individuals like them in the future.

Friday, December 2, 2011

A Technical Review

Collaboration is the name of the game when it comes to Software Engineering and an important part of the collaboration process is performing technical reviews on other developers' code.  Here we shall evaluate a group's Hale Aloha CLI project where they attempted to create an open source Java command line interface that would allow users to query a server for various energy / power related information.  This was to be done using the WattDepot library which simplifies the task of connecting to the server and retrieving the data so that the developer can focus on processing that data.  As a result, we will not focus so much on the code itself, but take a look at how "good" this project is by comparing it to the three prime directives of open source software.

Prime Directive #1: Does the system accomplish a useful task?
The first prime directive is pretty self-explanatory: does the system do what it is supposed to do?  Here I tested the various commands that the system was supposed to implement and checked if they worked as I expected them to.  This initial release of the system is expected to implement six commands: help (return a list of commands), quit (exit the CLI), current-power (returns the current power usage of the given tower/lounge in kW), daily-energy (returns the energy used by the given tower/lounge on a given date), energy-since (returns the energy used by the given tower/lounge from a given date to now), and rank-towers (rank towers from least to most energy used over a given interval of days).  All six of these commands are present in the system and are for the most part functional, but there are also a few problems.

The first major problem is that the commands do not work with all of the advertised source values.  The UserGuide lists lounge and telco sources (i.e. "Lokelani-04-telco") as valid, yet the system rejects them, claiming that they are not valid.  It works fine with all of the tower (i.e. "Lokelani") and lounge (i.e. "Lokelani-A") aggregates though.

The system failing to recognize a telco source as a valid source.

Another problem is that the help command breaks if the jar file is moved to a different directory.  The current implementation of the help command references an external text file with a hard-coded path to retrieve its data, and if the jar is moved outside of the distribution directory, the help command fails to work.  This is a problem since I think that the executable jar file should be self-contained and fully usable from anywhere in the file system.
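Since the jar should be self-contained, one possible fix (a hypothetical sketch, not the project's actual code; the class and resource names are made up) is to bundle the help text inside the jar and load it as a classpath resource, so it travels with the executable no matter where the jar is moved:

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

/** Sketch of loading help text from inside the jar instead of a hard-coded file path. */
public class HelpText {
  /** Returns the bundled help text, or a fallback message if the resource is missing. */
  public static String load(String resourceName) {
    // getResourceAsStream searches the classpath (including the jar itself),
    // so this works regardless of the jar's location on disk.
    InputStream in = HelpText.class.getResourceAsStream(resourceName);
    if (in == null) {
      return "Help text not available.";
    }
    StringBuilder text = new StringBuilder();
    try (BufferedReader reader =
        new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
      String line;
      while ((line = reader.readLine()) != null) {
        text.append(line).append(System.lineSeparator());
      }
    }
    catch (java.io.IOException e) {
      return "Help text not available.";
    }
    return text.toString();
  }
}
```

With the help file placed in the source tree and packaged by the Ant build, the command would keep working from any directory.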

The help command failing when the jar file is moved.

Yet another issue is that invoking the daily-energy command with today's date does not work.  The command tries to retrieve energy data between the start of the given date (12:00 am today) and the start of the next day (12:00 am tomorrow), which is an invalid range since the end of the range is in the future.  While the project specifications do not explicitly state whether today is a valid input for daily-energy, I expected the command to retrieve data from the start of today until the time of the latest recorded sensor data instead.
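The behavior I expected could be achieved by clamping the end of the query range so it never lies in the future.  As a rough sketch (the class and method names here are hypothetical, not from the project):

```java
/** Sketch of clamping a daily-energy query range so the end never lies in the future. */
public class RangeClamp {
  /** Returns the smaller of the requested end time and "now", both in epoch millis. */
  public static long clampEnd(long requestedEndMillis, long nowMillis) {
    return Math.min(requestedEndMillis, nowMillis);
  }
}
```

A query for today would then run from midnight up to the current moment (or the latest sensor reading) instead of producing an invalid range.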

The daily-energy command failing to work when given today's date.

Finally, rank-towers does not work when it is given the same day as the start and end dates.  Instead, it prints out just one line for the Lokelani tower with a value of 0.  This is not good, as giving the same date should either be considered valid input and print results for all four towers or be flagged as invalid and raise an error message to inform the user.  Thus, I believe that the current behavior of the rank-towers command in this case is unacceptable, as it does not indicate whether that input is valid and prints an incomplete output.

The strange output from rank-towers when using the same day as both the start and end dates.

Overall, most of the functionality is there.  Outside of these cases, all of the expected commands are present and work as expected by printing out results when given valid inputs and printing error messages when given invalid inputs.  Consequently, I believe that this system somewhat fulfills the first prime directive as long as the user is careful about which inputs are used and does not move the jar file.

Prime Directive #2: Can an external user successfully install and use the system?
This directive tests the documentation of the project from the view of an external user.  Unfortunately, the home page does not tell the user much about the system and lacks any sample input and output.  Furthermore, the UserGuide does not tell the user where to download the distribution or how to "install" the system (i.e. extract it from the zip archive), and the command to invoke the executable jar file is wrong (the name of the jar is "hale-aloha-cli-kmj.jar", not "hale-aloha-cli.jar").  On the other hand, the UserGuide does do a fairly good job of explaining which commands and arguments can be used (though there is that error mentioned in the previous prime directive), the distribution is labeled with a version number to distinguish it from past versions, and the distribution does contain an executable jar that does not require any building or compilation.  However, the aforementioned problems are rather major, so the system only partially fulfills this prime directive.

Prime Directive #3: Can an external developer successfully understand and enhance the system?
The third and final prime directive once again tests the project's documentation, but from a developer's view.  A major document for this directive is the DevelopersGuide, which should detail how an external developer can build the system and how to extend it.  While it does detail how to build the system using Ant, it does not tell the developer how to generate the system's JavaDocs.  It also does not mention that new functionality must be accompanied by new JUnit test cases, as it only mentions that additions must pass CheckStyle, PMD, and FindBugs.  In addition, the DevelopersGuide makes no mention of any coding standards that are being followed.  The instructions for extending the system are rather vague too, as they do not explain exactly what to modify in the CommandParser class to add a new command and do not describe how to add new commands to the help command.  Finally, the guide mentions that the project is being managed in an issue driven fashion, but does not explain exactly what that means for developers (i.e. make a new issue for every new command / functionality).

Outside of the DevelopersGuide, JavaDocs and source code comments can help external developers understand the system.  Overall, the JavaDocs do a decent job of explaining what the functions do and what the expected arguments are, even though there are a few misspellings (i.e. Comman's instead of Command's in the Command interface's printResult method's comment), fields are not commented, and the overview.html does not match the project.  Conversely, the source code comments are a little sketchy.  Some source files like EnergySince.java and HaleAlohaCli.java include inline comments to tell readers what the code is doing at a glance, while others like CommandParser.java have no such comments, leaving external developers to figure out the code on their own.  Another problem is that most of the classes do not support information hiding.  Ideally, all fields in a class should be private and only be accessible through getter and setter methods, but several classes like EnergySince and HelpCommand declare their fields without the private keyword.  This makes them package-private by default and allows any class within the package to manipulate the fields, which gives external developers the ability to put objects into possibly invalid states and unknowingly break the system.  All in all, this project does not satisfy the third and final prime directive very well, as it fails to explain all of the expectations that are in place when extending the system, does not consistently support information hiding, and leaves a few critical things up to the developer to figure out on their own.
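For illustration, here is a minimal example of the information hiding described above (the class name is made up, not from the project): the field is private, and the setter can reject states that direct field access would silently allow.

```java
/** Sketch of information hiding: the field is private and reached only through accessors. */
public class BaselineValue {
  private double kilowattHours;  // private: other classes cannot corrupt this directly

  public BaselineValue(double kilowattHours) {
    setKilowattHours(kilowattHours);
  }

  public double getKilowattHours() {
    return kilowattHours;
  }

  /** The setter enforces an invariant that package-private field access would bypass. */
  public void setKilowattHours(double kilowattHours) {
    if (kilowattHours < 0) {
      throw new IllegalArgumentException("Energy cannot be negative.");
    }
    this.kilowattHours = kilowattHours;
  }
}
```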

Other Expectations
In addition to fulfilling the three prime directives of open source software, there are some additional requirements that the system should meet.  One such requirement is that the system should be well tested so that it is easy to find out if enhancements broke pre-existing pieces of the system.  To be well tested, the developers should have created JUnit test cases to check most of the functionality of the system while executing a majority of the code for good code coverage.  Unfortunately, this system fails to do this, as it includes only one real JUnit test case.  While there are other "test cases," they do not use JUnit and have a .txt extension, so they are not automatically run by Ant when the system is verified.  Additionally, the one JUnit test case that is present only tests three out of the five methods in the class that it tests.  The result is a barely tested system with only about 18% of its code covered.  Hence, this system does not meet that requirement, as there are virtually no tests to determine if any new additions broke existing components of the system or not.

Another expectation was the use of issue driven project management (IDPM) to evenly distribute the work between group members.  In IDPM, members meet every few days to divide the current tasks into small work units called issues, which should be tracked on the Issues page of the project.  In addition, most commits to the system should have an associated issue in the commit log to help explain why that change was made.  Looking at the Issues page, it is quite clear who was responsible for what, and it shows that the workload was not exactly even, as Micah had 7 issues to work on while Richard and Jesse only had 4 issues each.  They did not do a very good job of linking their commits to their issues either, as only 19 out of 26 commits (~73%) are linked with an issue.  Some of the changes were minor and can be excused, but that would still put them below the desired 90% of commits associated with an issue.  Therefore, despite their efforts, this group did not quite make the best use of IDPM as they worked on this project.

The final expectation is the use of continuous integration.  Continuous integration tools periodically attempt to build and test the system (i.e. after every commit) to ensure that the system stays in an acceptable state.  If a build fails these tests, the project members are alerted so that they can find and fix the problem.  This project uses a Jenkins server, which can be found here as mentioned in the project's DevelopersGuide.  By looking at the Jenkins server, we can get an idea of how development of the project progressed.  There are a total of 9 failed builds, but 5 of them were a result of the server used by the system being down and were out of the group's control.  The other 4 failed builds were promptly fixed, with five hours being the longest the system stayed in a failed state, which is not too bad considering that this interval was in the middle of the day when the developers were probably busy with classes.  The rate of commits was otherwise quite consistent, with a couple of commits every day or two.  However, no commits were made between 11/10 and 11/15 or between 11/23 and 11/26, which implies that no significant work was completed in those time frames.  This equates to about one week of lost time, which is definitely concerning.  Still, it looks like this project made good use of the available continuous integration tools.

The build history of the project as found on the Jenkins server.

Conclusion
Overall, this project has some problems.  It fails to completely satisfy all three prime directives of open source software due to incomplete documentation and weak testing, and it also falls short of some of the other expectations for this project, as the weak testing led to poor code coverage and the group was not quite able to use issue driven project management optimally.  Despite these issues, their use of continuous integration was good and the overall system works in most cases.  In conclusion, this project could use a few tweaks, but reviewing it as an outsider has been an interesting and enlightening look into the software development processes of other developers.

Monday, November 28, 2011

A CLI Project With JCEV Using IDPM

Acronyms galore!  After learning about various software development tools and practices throughout this semester, I finally had a chance to apply them to a real group project.  This project involved creating a command line interface (CLI) which can be used to perform specific queries on the WattDepot server collecting data from the Hale Aloha residence towers.  We did this project in groups of three and used Issue Driven Project Management (IDPM) to keep the work flow moving smoothly.  Here I shall describe the process my group used and the resulting hale-aloha-cli-jcev system.

What is IDPM
First off, what is this issue driven project management?  Issue driven project management works by holding meetings often (every couple of days) to divide the current tasks into small work units that would take a day or two to complete.  These work units are called issues and are distributed between the group members.  The main principle behind IDPM is to consistently make, work on, and complete these issues so that every member knows what to work on and what to do next.  This keeps the flow of work consistent and helps to keep the project moving ahead smoothly and efficiently.  We implemented this methodology using the Google Project Hosting website which conveniently has an "Issues" page where we could manage our tasks.  We made rather extensive use of this page and ended up making more than 40 issues throughout the course of the project which aided us in getting everything done in a timely fashion.

Our project's issues page near the end of the project.
The CLI
So how does the project itself work?  The system consists of three main components.  The first is the main control class, which runs the command line interface and grabs the user's input.  The second is a processor class, which takes the user's input and tries to invoke the command that the user wants to run.  The third and final part contains the various commands that the user can invoke.  The main idea behind this approach is that each component is self-contained and can treat each of the other components as a "black box" as long as the developer knows what will be passed to the component he is working on.  For example, someone implementing a new command does not really need to know how the control class or the processor works as long as he understands that he should implement the provided Command interface and knows what will be passed to his command as explained in the interface documentation.  This makes the system somewhat modular and easy for developers to add to.  The current release version implements all of the expected functionality and includes a total of six working commands: help (return a list of commands), quit (exit the CLI), current-power (returns the current power usage of the given tower/lounge in kW), daily-energy (returns the energy used by the given tower/lounge on a given date), energy-since (returns the energy used by the given tower/lounge from a given date to now), and rank-towers (rank towers from least to most energy used over a given interval of days).
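As a hypothetical sketch of this design (the project's actual Command interface may look different), each command implements one small interface so the processor can treat every command as a black box:

```java
/** Sketch of the command component: the processor only knows this interface,
    never the concrete command classes (names here are illustrative). */
interface Command {
  String getName();             // the name the user types, e.g. "current-power"
  void execute(String[] args);  // the arguments as parsed by the processor
}

/** A trivial command showing the shape of an implementation. */
class EchoCommand implements Command {
  public String getName() {
    return "echo";
  }

  public void execute(String[] args) {
    // A real command would query WattDepot here; this one just echoes its arguments.
    System.out.println(String.join(" ", args));
  }
}
```

Because the control class and processor only ever see the interface, a new command can be developed and tested in isolation.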

However, there are a couple of quirks.  One major quirk is the quit command, which needs to communicate with the control class to stop the application.  This is a special case that required the quit command to throw a special exception to be passed up to the control class, as the obvious solution of using System.exit is bad practice.  This works, but it does not quite fit the intent behind the design, as both the control and the processor classes had to be modified to make it work.  Another quirk is the fact that the table generated in the processor class has to be modified to add a new command.  Ideally, the user could just add a new class and the processor could dynamically find and add it, but we did not have time to implement a reflection-based processor class and resorted to using a hard-coded table.  As a result, anyone adding a new command would need to change the table in the processor class by adding a new put call to the table generation function.
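A rough sketch of what such a hard-coded table might look like (the command names are the project's, but the Runnable type and builder method are illustrative stand-ins, not the actual code):

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch of the processor's hard-coded dispatch table: each new command
    requires one more put call in this builder method. */
class CommandTable {
  static Map<String, Runnable> buildTable() {
    Map<String, Runnable> table = new HashMap<>();
    table.put("help", () -> System.out.println("Available commands: ..."));
    table.put("current-power", () -> System.out.println("current-power invoked"));
    // Adding a new command means adding one more entry like this:
    table.put("set-baseline", () -> System.out.println("set-baseline invoked"));
    return table;
  }
}
```

A reflection-based processor would scan the commands package instead of maintaining this list by hand, which is the enhancement we ran out of time for.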

Working with JCEV
As with any group project, you need a group and I had the pleasure of working with Jordan Takayama and Eldon Visitacion in team JCEV.  While it took us a little bit to adjust to each other's coding styles, we easily created a functional system before the due date.  This gave us time to check over each other's code and fix minor errors and inconsistencies like using "kWh" versus "kilowatt-hours".  Everyone also made a JUnit test case for each class with good coverage to ensure that everything is working properly.  As a result, I believe that our system contains quality software that has been thoroughly tested.

Conclusion
Overall, I believe team JCEV effectively used IDPM to create a CLI system.  We made good use out of IDPM to distribute and track the work that needed to be done which allowed us to finish with enough time to perform extensive checks and fix the tiny, non-system breaking errors.  Consequently, I believe that hale-aloha-cli-jcev is a quality software system that has been well tested to ensure that it works.  Not only that, but I have also gained valuable experience by putting all of the software engineering practices to use in an actual group project.  I have learned a lot from this project and I am definitely looking forward to next time!

Tuesday, November 8, 2011

Energizing Exercises: WattDepot Code Katas

If you have been keeping up with this blog, it should come as no surprise that energy-related research and software engineering go hand-in-hand, and here we shall have our first glimpse at how these two seemingly unrelated fields can assist each other.  To recap, one way that software engineering can assist energy research is through data collection and analysis, and software packages like WattDepot can be used to do just that (and more!).  As with the previous systems, we shall learn the basics of WattDepot through a set of simple exercises or katas to get some hands-on experience as we get our feet wet.



Kata 1: SourceListing
Implement a class called SourceListing, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a list of all sources defined on that server and their descriptions, sorted in alphabetical order by source name.  Use the System.out.format method to provide a nicely formatted list. 
As the first exercise, Kata 1 was relatively simple.  It was made even simpler by the fact that the example program essentially does this already.  While the code itself was quick and easy, I was not sure if the sources retrieved from the server were always returned in alphabetical order.  It took a while, but I eventually convinced myself that they were sorted, and this uncertainty made the exercise take about 25 minutes.


Kata 2: SourceLatency
Implement a class called SourceLatency, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a list of all sources defined on that server and the number of seconds since data was received for that source, sorted in ascending order by this latency value.  If no data has ever been received for that source, indicate that.  Use the System.out.format method to provide a nicely formatted list.
The second exercise was rather straightforward as well.  It took a little bit of data manipulation to calculate the latency, but this was also shown in the example program.  The bulk of the time here was spent figuring out how to get the values sorted by latency.  After thinking about it for a few minutes, I decided to implement a LatencyData class which stores a source name and a latency value.  LatencyData implements the Comparable interface and contains methods to order it on the latency value.  As a result, all I had to do was make a bunch of LatencyData objects, throw them into an ArrayList, use Collections.sort to get the results in the right order, and print them.  All of this took about 40 minutes since I had to override several methods for the LatencyData class.  However, I should have made a general sorting class to be used by the later exercises that require data to be sorted on a value other than name.  I chose to implement separate classes for each type of data in the later exercises (i.e. EnergyData and PowerData), and a general class would have saved time and effort.
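The LatencyData approach can be sketched roughly like this (a simplified reconstruction, not the original code; the helper method is added just to show the sort):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

/** Sketch of LatencyData: pair a source name with its latency and order the
    pairs by latency so Collections.sort does the work. */
class LatencyData implements Comparable<LatencyData> {
  final String sourceName;
  final long latencySeconds;

  LatencyData(String sourceName, long latencySeconds) {
    this.sourceName = sourceName;
    this.latencySeconds = latencySeconds;
  }

  @Override
  public int compareTo(LatencyData other) {
    return Long.compare(this.latencySeconds, other.latencySeconds);
  }

  /** Returns the source names sorted by ascending latency. */
  static List<String> sortedNames(List<LatencyData> data) {
    List<LatencyData> copy = new ArrayList<>(data);
    Collections.sort(copy);  // uses compareTo, so the list ends up in latency order
    List<String> names = new ArrayList<>();
    for (LatencyData d : copy) {
      names.add(d.sourceName);
    }
    return names;
  }
}
```

A more general version would take a comparator instead of hard-coding the latency field, which is the reusable sorting class I wished I had written.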



Kata 3: SourceHierarchy
Implement a class called SourceHierarchy, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a hierarchical list of all sources defined on that server.  The hierarchy represents the source and subsource relationship between sources.
Here is where things started to get a bit more interesting.  Getting the subsources is rather simple: call the getSubSources method, use the resulting object's getHref method, and then get the subsource's name by retrieving the sub-string after the final '/'.  Making sure that the indentations were right and that subsources were not printed as top-level sources was a little bit trickier.  To print the subsources, I decided to write a recursive method that prints the name of the top-level source, then calls itself on each subsource with a larger indentation.  This way, it is guaranteed to print the hierarchy properly no matter how many levels of subsources are present.  In addition, the method removes any of the found subsources from the original source list, preventing the issue of them being printed again as top-level sources.  All in all, this took about 45 minutes to figure out and implement.
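The recursive approach can be sketched like this (SourceNode is a stand-in for the real WattDepot source objects, and the demo tree is made up):

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of the recursive hierarchy printer: render a source, then recurse on
    each subsource with a deeper indent. */
class SourceNode {
  final String name;
  final List<SourceNode> subsources = new ArrayList<>();

  SourceNode(String name) {
    this.name = name;
  }

  /** Builds the indented hierarchy as a string; depth controls the indentation. */
  static String render(SourceNode node, int depth) {
    StringBuilder out = new StringBuilder();
    for (int i = 0; i < depth; i++) {
      out.append("  ");
    }
    out.append(node.name).append('\n');
    for (SourceNode sub : node.subsources) {
      out.append(render(sub, depth + 1));  // recursion handles any nesting level
    }
    return out.toString();
  }

  /** Tiny demonstration tree with one parent and one subsource. */
  static String demo() {
    SourceNode root = new SourceNode("SIM_OAHU");
    root.subsources.add(new SourceNode("SIM_KAHE"));
    return render(root, 0);
  }
}
```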


Kata 4: EnergyYesterday
Implement a class called EnergyYesterday, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a list of all sources defined on that server and the amount of energy in watt-hours consumed by that source during the previous day, sorted in ascending order by watt-hours of consumption.  If no energy has ever been consumed by that source, indicate zero.  Use the System.out.format method to provide a nicely formatted list.
Kata 4 was probably the most time consuming of them all.  While it seemed simple at first, getting the time stamps for the beginning and end of yesterday proved to be a little trickier than I thought.  To calculate yesterday's date, I decided to use the java.util.Calendar class (though in hindsight, it would have been easier just to use the built-in Tstamp functions).  I then used this Calendar object to create a string which would be used to make a time stamp to be passed to the getEnergyConsumed function.  However, I neglected to notice that the month field of the Calendar class is actually one less than the traditional numbering system (i.e. January is 0), so my time stamps were invalid.  This caused me to get a lot of BadXmlExceptions, as the range I specified was outside of the stored sensor data.  It took me multiple sessions to figure this problem out, and this exercise took at least 4 hours to complete.  On the bright side though, I was able to apply the same time stamp generation methods to the following katas, which made them much easier.
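The pitfall can be shown in a few lines (a reconstruction for illustration, not the original code): Calendar.MONTH is zero-based, so building a date string requires adding 1 to the month field.

```java
import java.util.Calendar;

/** Sketch of the Calendar pitfall: Calendar.MONTH is zero-based (January is 0),
    so a formatted date string must add 1 to the month field. */
class YesterdayStamp {
  /** Formats the calendar's date as yyyy-MM-dd, correcting the month offset. */
  static String toDateString(Calendar cal) {
    return String.format("%04d-%02d-%02d",
        cal.get(Calendar.YEAR),
        cal.get(Calendar.MONTH) + 1,  // +1 because Calendar months start at 0
        cal.get(Calendar.DAY_OF_MONTH));
  }

  /** Returns yesterday's date string relative to the given calendar. */
  static String yesterday(Calendar cal) {
    Calendar copy = (Calendar) cal.clone();
    copy.add(Calendar.DAY_OF_MONTH, -1);  // Calendar handles month/year rollover
    return toDateString(copy);
  }
}
```

Forgetting the +1 shifts every range back a month, which is exactly what produced the BadXmlExceptions.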


Kata 5: HighestRecordedPowerYesterday
Implement a class called HighestRecordedPowerYesterday, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a list of all sources defined on that server and the highest recorded power associated with that source during the previous day, sorted in ascending order by watts.  Also indicate the time when that power value was observed. If no power data is associated with that source, indicate that.  Use the System.out.format method to provide a nicely formatted list.
While this one sounded more complex than Kata 4 at first, it was relatively simple since I had the time stamps figured out.  The two main issues here concerned virtual sources (aggregates of non-virtual subsources) and finding the maximum recorded power.  Dealing with virtual sources can be tricky since they do not contain data points themselves and their subsources may not be synchronized.  However, this can be handled by sampling the data at large intervals (15 minutes in my case) to account for the discrepancies in synchronization, allowing virtual sources to create data points by aggregating the data of their subsources.  The second issue of finding the maximum power is then easily brute-forced by iterating through all of the resulting data points and simply storing the one with the largest value.  Consequently, this one only took about 35 minutes to complete.
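The brute-force maximum search is simple enough to sketch (the array of readings stands in for the sampled WattDepot data points; tracking the index also recovers the time the maximum was observed):

```java
/** Sketch of the brute-force maximum search over sampled power readings:
    scan every sample and keep the index of the largest value. */
class MaxPower {
  /** Returns the index of the largest reading, or -1 for an empty array. */
  static int indexOfMax(double[] watts) {
    int best = -1;
    for (int i = 0; i < watts.length; i++) {
      if (best == -1 || watts[i] > watts[best]) {
        best = i;
      }
    }
    return best;
  }
}
```

With 15-minute sampling, yesterday is only 96 data points per source, so a linear scan is plenty fast.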


Kata 6: MondayAverageEnergy
Implement a class called MondayAverageEnergy, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a list of all sources defined on that server and the average energy consumed by that source during the previous two Mondays, sorted in ascending order by watt-hours.  Use the System.out.format method to provide a nicely formatted list.
Surprisingly, the final kata was not too difficult.  After playing with the Calendar class in the previous exercises, it was fairly straightforward to find the previous two Mondays.  The only tricky part was getting the previous week's Monday if today is Sunday or Monday, since there would not be a complete data set for that week's Monday.  Other than that, getting the two energy consumed readings and averaging them was very simple.  As a result, this exercise was completed relatively quickly with a time of 25 minutes.
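The previous-Monday logic can be sketched with a hypothetical helper (my own illustration, not the kata's actual code); stepping back at least one day before testing the weekday ensures an in-progress Monday is skipped:

```java
import java.util.Calendar;

public class PreviousMondays {
    // Returns a copy of 'from' moved back to the most recent Monday
    // strictly before it, so a partial "today" Monday is never used.
    public static Calendar previousMonday(Calendar from) {
        Calendar cal = (Calendar) from.clone();
        do {
            cal.add(Calendar.DAY_OF_MONTH, -1);  // always step back at least one day
        } while (cal.get(Calendar.DAY_OF_WEEK) != Calendar.MONDAY);
        return cal;
    }

    public static void main(String[] args) {
        Calendar lastMonday = previousMonday(Calendar.getInstance());
        Calendar mondayBefore = previousMonday(lastMonday);
        System.out.println(lastMonday.getTime());
        System.out.println(mondayBefore.getTime());
    }
}
```

Calling the helper a second time on its own result yields the Monday of the week before, giving the two days whose energy readings get averaged.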




Overall, using these exercises as an introduction to WattDepot has shown me just how useful such tools can be in collecting and analyzing energy data.  With WattDepot, programmers do not have to go out of their way to get the data they want so they can spend their time doing meaningful analysis of that data.  While I did encounter a rather frustrating non-WattDepot related problem as I worked through these katas, the WattDepot API proved to be simple and painless to use.  So if you are interested in doing some energy data collection or analysis, WattDepot seems to be the way to go!

Tuesday, November 1, 2011

Energy in Hawaii and Software Engineering

Introduction
Hawaii is a unique place with its own unique set of problems.  This is especially true when it comes to the issue of electrical energy in terms of both generation and consumption.  So why would this be covered in a software engineering log?  Of course, this is an important issue for the residents of Hawaii to consider, but it is also producing some new and interesting software development opportunities.  Thus, let's take a look at some of the unique issues of Hawaii's energy situation and some of the efforts that are being taken to alleviate them.

Electrical Energy vs. Power
Before we begin, what is electrical energy?  Obviously we know that electricity is the thing that comes out of our outlets, but how do we measure it?  It turns out that there are actually two related measures of electricity: energy and power.  Energy is what makes our appliances run and makes electricity useful, and it is usually measured in kilowatt-hours.  So why are our electrical appliances rated according to wattage?  Those are ratings of power, which tells us the rate at which an appliance consumes electrical energy.  Consequently, the energy used is the product of the power drawn by the appliance and the time it is running (energy = power * time), and the power is the rate at which the energy is being converted (power = energy / time).  Hence, these two related concepts are more distinct than they appear at first glance.  For more information and a different explanation, please consult this video.
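To make the relationship concrete, here is a tiny illustrative calculation (the class and method names are mine, invented for this sketch):

```java
public class EnergyVsPower {
    // energy (kWh) = power (kW) * time (h)
    public static double kilowattHours(double watts, double hours) {
        return watts / 1000.0 * hours;
    }

    public static void main(String[] args) {
        // A 100 W bulb left on for 10 hours uses 1 kWh of energy.
        System.out.println(kilowattHours(100, 10) + " kWh");
    }
}
```

So two appliances with very different power ratings can use the same energy: a 100 W bulb for 10 hours and a 1000 W appliance for 1 hour both consume 1 kWh.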

Issues
Now that we understand energy, what are the issues concerning electrical energy in Hawaii?  This energy must be generated in some fashion by the electric company, typically from non-renewable resources.  While mainland power plants also use these resources, they often opt for cheaper local resources when they are available.  However, due to Hawaii's unique location and limited local resources, most of its power generation comes from expensive imported oil and coal, making energy costs much higher than on the mainland.  In addition, the mainland states make use of large interconnected power grids to spread the workload and increase efficiency.  Hawaii's geography, on the other hand, prevents the creation of a single statewide grid, resulting in the use of multiple, much smaller, and less efficient grids to supply each island.  Therefore, Hawaii holds a rather unique position on the energy front in comparison to the mainland states.

Efforts
With all of these problems, there must be a solution, right?  Well, look no further, as the Hawaii Clean Energy Initiative (HCEI) has laid down what needs to be done to alleviate the aforementioned issues.  The HCEI aims to cut energy usage by 30% through efficiency while meeting 40% of demand with clean energy generation, for a combined 70% clean energy target, a huge step in improving Hawaii's energy future.  To do this, residents must increase their energy efficiency by following less wasteful energy usage practices, such as keeping refrigerators in a cool location or replacing incandescent light bulbs with compact fluorescent bulbs or light-emitting diodes.  In addition, more energy would be generated through clean and renewable means, such as solar power or wind turbines that capture the energy of the ever-present wind.  As a result, both more efficient energy usage and the incorporation of local renewable energy resources are needed to help solve Hawaii's energy problems.

Software Engineering Opportunities
So where does software engineering fit into all of this?  For one, researchers will need to collect and analyze a lot of data to quantify the effectiveness of their energy efficiency efforts, and they will need software tools to perform these tasks.  Consequently, they will need software engineers with energy knowledge to design and create those tools.  Furthermore, incorporating renewable energy resources will require smarter power grid technology.  These new sources must be added and removed intelligently: simply dumping more energy into the grid would be wasteful, as the non-renewable generators would still be running at normal capacity, while taking too many of those generators offline would leave the grid unable to meet demand and cause it to fail.  As a result, software will be needed to track both power generation and consumption to help incorporate these new energy sources safely and effectively.

Conclusion
Overall, Hawaii is a unique place with its own unique set of energy problems.  To solve these issues, Hawaii must both attempt to improve its energy efficiency and work on tapping the many local renewable resources.  However, both of these tasks will require the aid of software tools, providing software engineers with many research opportunities.  Hopefully you have gained a better understanding of Hawaii's current energy situation and can see how this seemingly unrelated field can be beneficial to the local software engineers.  So save energy and keep green.  Perhaps a software engineering opportunity might manifest itself as a result!

Monday, October 24, 2011

5 Important Software Engineering Practices / Concepts

It's been a good semester of Software Engineering so far and I have learned a lot.  Of course, there are too many important tidbits to share in one post, but here I will share 5 points that stuck out to me, which you will hopefully find interesting or useful (plus it will help me review for the upcoming midterm!).

  1. Why should you always override the hashCode method when you override that object's equals method?

    The contract of the hashCode method states that if two objects are equal according to that object's equals method, then calling the hashCode method on the two objects must return the same integer. Since overriding the equals method can change the definition of what makes two objects equal, the hashCode method must also be overridden to stay consistent with the new definition.
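As a minimal illustration (a hypothetical Point class, not from any particular project), overriding both methods together might look like:

```java
public class Point {
    private final int x;
    private final int y;

    public Point(int x, int y) {
        this.x = x;
        this.y = y;
    }

    @Override
    public boolean equals(Object other) {
        if (this == other) {
            return true;
        }
        if (!(other instanceof Point)) {
            return false;
        }
        Point p = (Point) other;
        return x == p.x && y == p.y;
    }

    // Must be overridden alongside equals: equal points need equal hash
    // codes, or hash-based collections like HashSet will misplace them.
    @Override
    public int hashCode() {
        return 31 * x + y;
    }
}
```

Since hashCode depends only on the same fields equals compares, any two equal Points are guaranteed to hash identically.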

  2. What are the four properties that any implementation of the equals function should exhibit when comparing two non-null objects?

    1. Reflexivity: x.equals(x) must return true.
    2. Symmetry: x.equals(y) must return the same value as y.equals(x).
    3. Transitivity: If x.equals(y) returns true and y.equals(z) returns true, then x.equals(z) must also return true.
    4. Consistency: x.equals(y) must return the same value over multiple calls to equals given that no information used in the comparison is changed.

  3. Which attribute of the Ant <property> tag should you use to make properties representing paths? Why?

    While value can work, you should use the location attribute of the <property> tag because location converts the file separators (i.e. '/' or '\') to the one that the operating system uses, while value stores the string verbatim. Hence, using the location attribute allows for better portability, as it will change the separators in the paths to match the operating system invoking Ant.
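A small hypothetical build-file fragment makes the difference visible (the property names here are invented for illustration):

```xml
<project name="demo" default="show">
  <!-- 'location' resolves the path using the host OS's file separator;
       'value' stores the string exactly as written. -->
  <property name="src.dir" location="src/main" />
  <property name="src.raw" value="src/main" />

  <target name="show">
    <echo message="location: ${src.dir}" />  <!-- e.g. C:\project\src\main on Windows -->
    <echo message="value:    ${src.raw}" />  <!-- always src/main -->
  </target>
</project>
```

This fragment also happens to show the ${...} syntax for reading a property's value.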

  4. What syntax do you use to access the value of a property defined in an Ant build file? For example, how would you access the value of:
    <property name="new.property" value="new" />

    Surround the property name with ${ }. So ${new.property} may be used to access the value of the sample property above.

  5. What are the three general categories of test cases?

    1. Acceptance tests: The program achieves and passes some basic requirement. For example, a Robocode robot always wins against a certain sample robot.
    2. Behavioral tests: The implementation actually works as intended. For instance, the Robocode robot actually follows the strategy by moving to the specified locations throughout the battle.
    3. Unit tests: Test small self-contained classes or methods in isolation. For example, testing the output of a Robocode robot's fire-control method to ensure that the output matches the expected values without actually running the battles to check the robot's behavior.

And there you have 5 of the software engineering highlights that I have seen so far. Hopefully you have picked up something new and useful and I hope to share more with you as I continue my journey through the wonderful realm of software engineering!

Thursday, October 20, 2011

Group Projects Made Easy - Subversion and Google Project Hosting

Introduction
In software engineering, there are many problems that are just too large for a single developer to tackle in any timely fashion and require multiple software developers to work in tandem. However, this can lead to many problems. For example, how does one ensure that they are working on the latest version of the project or that one's changes are not overwritten by another developer's modifications? These problems are magnified as more and more developers are added to the project, yet large projects with many programmers are common and run successfully without these problems. So how do they do this? The answer is that they use configuration management tools like Subversion. Here I shall share my initial experiences in using the Subversion configuration management tool and the Google Project Hosting website to host my CrazyTracker robot.



Subversion and Google Project Hosting
So we want to collaborate with others on this CrazyTracker robot, but how do we do it? First off, we need to get the tools required for Subversion. There are two main components of Subversion: a client program to communicate with the source code repository and a server to host said repository. I use a Windows operating system, so I chose the TortoiseSVN client, which lets the user right-click a folder and choose various commands from a context menu. As for the server, we can easily create one by making a new project on Google Project Hosting. Once those are set up, it is just a simple matter of checking out the empty source directory of our new project, adding the CrazyTracker project's files in TortoiseSVN, and committing the changes to upload the project to the Google Project Hosting server.

Now that the code is up there, we have to add some documentation in the form of wiki pages (i.e. user / developer guides) and add some "committers" to work on the project. Once that is completed, anyone can see the source code and anyone with committer permissions can upload their changes to the repository using Subversion.

Some of the commands available through the TortoiseSVN user interface.
Includes the "Blame" command which shows the last person that modified each line.




Conclusion
Overall, setting up the robocode-fch-crazytracker project on Google Project Hosting was relatively quick, easy, and painless. TortoiseSVN installed with no problems on my Windows computer, and Google's simple and intuitive interface made creating the project and its documentation quite painless. If anything, the only gripes I had about the whole process were that I had to check out the empty project just to upload the original source files (why can't we add source files when creating the project?) and that the wiki markup language assumes that words with capital letters are links to other wiki pages and puts a "?" link after them. As a result, I had to put a "!" before those words to make those links go away; a simple fix, but it did get annoying after a while. However, my first experience with Subversion and Google Project Hosting was definitely a good one, and I can see how such tools can help with collaboration. Subversion in particular has some very interesting features (for instance, it can tell you who changed which lines of code with the blame function!) and I am definitely looking forward to using these tools for my future group projects. Now that you have seen just how easy it is to set up the tools needed to make group work that much less painful, let's get collaborating!

Tuesday, October 11, 2011

CrazyTracker

Introduction
If you have been keeping track of this blog, you might remember the post on Robocode katas.  While those exercises may have been a little mundane, they were my first step on the road to the ICS 314 Robocode Tournament for which I had to create my own robot to compete along with the various test cases needed to ensure that it works as intended.  Consequently, I have created a slightly more complex robot called "CrazyTracker" to compete in the tournament (and hopefully win!).

Overview
Robocode robots have three main functions: movement, firing, and targeting.  CrazyTracker incorporates all of these in different ways.

Movement:
CrazyTracker uses a random movement pattern (hence the "crazy") where it moves to a random point between 50 and 150 pixels away on every iteration of the run loop.  The idea is to prevent it from being predictable so it can dodge as many of the opponent's bullets as possible.  In addition, it will move in a random direction away from a wall or rammed robot for a random distance between 50 and 100 pixels.  This prevents it from getting hung up on walls or enemies and keeps those enemies at a distance to reduce the amount of damage taken.
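A rough sketch of how such a destination could be picked (illustrative only, not CrazyTracker's actual source; clamping the point to the battlefield is omitted):

```java
import java.util.Random;

public class RandomMove {
    private static final Random RAND = new Random();

    // Picks a destination 50-150 pixels away in a random direction.
    // Returns {x, y}; keeping the point inside the battlefield would
    // be an additional step in a real robot.
    public static double[] randomPoint(double x, double y) {
        double distance = 50 + RAND.nextDouble() * 100;   // 50..150 px
        double angle = RAND.nextDouble() * 2 * Math.PI;   // any direction
        return new double[] {
            x + distance * Math.cos(angle),
            y + distance * Math.sin(angle)
        };
    }
}
```

Because both the distance and the direction are drawn fresh each loop iteration, an opponent's targeting algorithm gets no usable movement pattern to predict.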

Using random movement to dodge enemy bullets.


Firing:
CrazyTracker uses two factors in its firing algorithm: the current hit rate (how many bullets have hit so far / total number of bullets fired) and the distance to the target determine the strength of the bullet to fire, if it should fire at all.  Here, the robot tries to reduce the energy wasted on missed bullets, firing weaker but cheaper bullets at longer distances and only firing at closer distances if the hit rate is too low.
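One plausible shape for such a calculatePower rule is sketched below; the distance and hit-rate thresholds are invented for illustration and are not CrazyTracker's actual values:

```java
public class FiringMath {
    // Illustrative hit-rate- and distance-based power rule.  In Robocode,
    // bullet power ranges from 0.1 (cheap) to 3.0 (maximum damage), and
    // stronger bullets cost the firing robot more energy.
    public static double calculatePower(double hitRate, double distance) {
        if (distance > 400 && hitRate < 0.2) {
            return 0.0;   // far away and missing a lot: hold fire
        }
        if (distance > 400) {
            return 0.5;   // far: cheap bullet, little energy wasted on a miss
        }
        if (distance > 200) {
            return 1.5;   // medium range: moderate power
        }
        return 3.0;       // close: full power
    }
}
```

Returning 0.0 gives the caller a natural way to express "do not fire at all", matching the post's description of skipping long shots when the hit rate is poor.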

Tracking:
CrazyTracker uses a simple targeting scheme.  The targeting algorithm basically just scans for the enemy, points the gun at it, and shoots.  While I did want to add some leading if the enemy robot is moving, the current algorithm only aims for the center of the enemy.  While this means that it is not as accurate as it could be, it still allows CrazyTracker to fire at enemies regardless of which way its body is facing which varies a lot due to the random nature of its movement.

The targeting algorithm faces the gun towards the enemy when it is time to fire.

Results
While I believe that my strategy is sound, it did not turn out so well in practice.  CrazyTracker can only reliably beat the SittingDuck, Crazy, and Fire robots, and even then, it does lose to Crazy and Fire every once in a while due to its random nature.  The results versus the other sample robots are even worse, with a 48% win rate against Tracker, 41% against Corners, 19% against RamFire, 18% against Spinbot, and an abysmal 15% against Walls.  This is probably due to the targeting algorithm: since it fires at the center of the enemy, it has a hard time hitting fast-moving or non-linearly moving targets like Walls or Spinbot.  Here, some leading could have helped.  The low win rate against RamFire is likely due to the random way CrazyTracker handles running into another robot.  This means that it can get stuck in a corner or just spin in circles if it is unlucky, giving RamFire the advantage.  That behavior could be fixed, but I wanted to keep the random aspect even though it could create a disadvantage.

Testing
To check the functionality of the CrazyTracker robot, I have created 6 JUnit test cases.  Two of these test cases are acceptance tests which check that CrazyTracker places first against a certain robot (Crazy and Fire in my case) and can beat that robot at least 80% of the time to account for its random nature.

In addition to these acceptance tests, I also have two unit tests.  The first checks a subclass of Robot that I implemented to provide a moveToPoint method, ensuring that moveToPoint actually moves the robot to the given point and properly handles points that are out of reach (i.e. outside of the battlefield) by moving to the closest valid point instead.  The second unit test checks CrazyTracker's calculatePower method.  This method determines the power of the bullet that the robot should fire (if any) depending on its current hit rate and the distance from the target, and the unit test ensures that it returns the expected values at the boundary cases.

Finally, I created two behavioral tests to check the firing and tracking behaviors of the robot.  For firing, I wanted to make sure that only bullets of the powers specified by the calculatePower method were being fired, so the test checks all of the bullets and sets a flag variable to true if a bullet with an unexpected power is found.  The tracking test pits CrazyTracker against the SittingDuck robot and checks that CrazyTracker's energy never drops below 95.  This would be the case if the tracking algorithm works properly, as each successful hit restores energy, making the robot gain energy as long as its bullets hit as they should.  The threshold of 95 is needed, however, because the initial bullet fired drops the robot's energy below 100 until it hits.

All in all, I believe these test cases cover the majority of CrazyTracker's features and functions and help ensure that everything is working as intended.

Conclusion
Overall, CrazyTracker was not only a good experience in learning Robocode, but also a great way to learn software engineering techniques like the use of automated quality assurance tools. As mentioned, CrazyTracker made me learn how to use testing tools like JUnit to help ensure that everything works, and it has also shown the usefulness of other tools like Checkstyle, PMD, and FindBugs.  These tools do not do much in terms of testing, but they help to verify that my code is properly formatted and that there are no obvious bugs.  Finally, this project has taught me that adamantly sticking to a plan is not always the best idea.  From beginning to end, I kept the same general strategy behind the CrazyTracker robot even though it became obvious that it would be hard to beat many of the sample robots with that design.  Nevertheless, I stuck with that strategy until it was too late to make any real changes.  As a result, I learned that I should be more flexible, as that could lead to more optimal solutions.  However, I do hope that CrazyTracker will be able to hold its own against the other robots in the tournament despite its weaknesses, and hopefully the random number generator will return values that are in my favor.  Go CrazyTracker!

Wednesday, September 28, 2011

Building Systems with Ant

Introduction

Ever wish there was an easier way to build your Java projects?  Tired of using the command line to manually run all of your fancy tools?  Well wait no longer as Ant can automate your build processes with the help of a few user-defined XML files.  These XML files are created using a special subset of the XML language and to learn this we shall once again use the concept of code katas.

Tasks
  1. Ant Hello World - Print Hello World.  Learn the <echo> element
  2. Ant Immutable Properties - Show that Ant properties are immutable once they are set.  Learn the <property> element.
  3. Ant Dependencies - Learn how the depends attribute of the <target> tag works.
  4. Hello Ant Compilation - Use the <javac> element to compile a Java program.
  5. Hello Ant Execution - Use the <java> element to run a Java program.
  6. Hello Ant Documentation - Use the <javadoc> element to generate the documentation for a Java program.
  7. Cleaning Hello Ant - Create a target that deletes the build directory to clean the system.
  8. Packaging Hello Ant - Clean the system and pack it into a zip for distribution.

Experiences

Most of the katas were simple and easy to complete using a few targets and lines of code.  The only kata that gave me problems was the Packaging Hello Ant kata.  While it was simple to package all of the contents of the working directory into a zip archive, it was trickier to place all of those contents within a folder inside the archive.  To do this, I had to use the <copy> element to copy the contents of the working directory into a new temporary directory.  I could then tell the <zip> element to include that directory and all its contents, creating a zip archive with the desired structure, and finally delete the temporary folder.  Hence, this exercise took a little more thinking about how to get the appropriate structure for the zip archive.

From these exercises, I learned how convenient build systems can be.  With a single command, I could compile a program, run it, generate documentation for it, clean the project, and pack it for distribution.  This would take many command-line operations and being able to do it just by typing "ant -f dist.build.xml" saves a lot of effort and time.

Overall, these exercises have given me a better appreciation for build systems, especially Ant.  Not only are they extremely convenient, they are also quite simple to use as the basic XML language used to direct Ant is rather simple and well documented.  Consequently, I am definitely looking forward to seeing Ant's true power as I start to work on bigger and bigger systems and see just how much simpler it makes the building process.

Tuesday, September 20, 2011

Code Katas and Robocode

How does one become great at something?  Of course, there are many possible factors, but the bulk of one's progress towards greatness comes from practice.  From learning a musical instrument to martial arts, practice is a key component of getting better at whatever task is at hand.  Programming is no different, but developers tend to do most of their practicing on the job, leading to mistakes and difficulties as they work on real projects. To combat this, developers can use what are called code katas: short, open-ended programming problems to be solved outside of a project environment, designed to serve as practice sessions for software developers.  There are many different katas for many different systems and languages, but here we shall focus on a set of 13 katas for the Robocode system in the Java programming language (Learn more about code katas!).  Now, what is Robocode, you may be thinking?  Robocode is an open source software system that lets users build software tanks that can move, find enemies using their radar, and fight using their guns, and these katas help to introduce new users, such as myself, to the basics needed to create an unbeatable fighting machine.  (Learn more about and download Robocode)


The Katas

  • Position01: The minimal robot. Does absolutely nothing at all. 
  • Position02: Move forward a total of 100 pixels per turn. When you hit a wall, reverse direction.
  • Position03: Each turn, move forward a total of N pixels per turn, then turn right. N is initialized to 15, and increases by 15 per turn.
  • Position04: Move to the center of the playing field, spin around in a circle, and stop.
  • Position05: Move to the upper right corner. Then move to the lower left corner. Then move to the upper left corner. Then move to the lower right corner.
  • Position06: Move to the center, then move in a circle with a radius of approximately 100 pixels, ending up where you started.
  • Follow01: Pick one enemy and follow them.
  • Follow02: Pick one enemy and follow them, but stop if your robot gets within 50 pixels of them.
  • Follow03: Each turn, Find the closest enemy, and move in the opposite direction by 100 pixels, then stop.
  • Boom01: Sit still. Rotate gun. When it is pointing at an enemy, fire.
  • Boom02: Sit still. Pick one enemy. Only fire your gun when it is pointing at the chosen enemy.
  • Boom03: Sit still. Rotate gun. When it is pointing at an enemy, use bullet power proportional to the distance of the enemy from you. The farther away the enemy, the less power your bullet should use (since far targets increase the odds that the bullet will miss). 
  • Boom04: Sit still. Pick one enemy and attempt to track it with your gun. In other words, try to have your gun always pointing at that enemy. Don't fire (you don't want to kill it). 

Completing the Katas


While working on the katas, it became apparent that the difficulty level varied quite a bit between exercises.  Position01 and Position03 were quite straightforward, as only simple calls to the Robocode library functions were needed, while others like Position02, Boom01 to Boom03, and the Follow katas added the use of simple event handling concepts.  These were quickly completed after a few looks at the Robocode API documentation and served as stepping stones for the harder katas.

However, Boom04 was slightly more complicated, as the gun and the robot body can have different headings, so the heading that the gun needed to point to had to be calculated.  The only way to find the position of an enemy is the getBearing() method, which returns the angle of the enemy relative to the robot's current heading.  Hence, I had to calculate the heading that bearing corresponded to so that I could rotate the gun to the enemy's position.  I also added checks to ensure that the gun takes the shortest path possible to the enemy's position: any turn greater than 180 degrees is changed to a turn in the opposite direction.
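The shortest-path idea can be sketched as follows (my own illustrative helpers, not the kata's exact code):

```java
public class GunMath {
    // Normalizes an angle in degrees to (-180, 180] so a turn through
    // this angle always goes the shorter way around.
    public static double normalize(double degrees) {
        while (degrees > 180) {
            degrees -= 360;
        }
        while (degrees <= -180) {
            degrees += 360;
        }
        return degrees;
    }

    // How far the gun must turn: the target heading is the robot's
    // heading plus the enemy's relative bearing, minus where the gun
    // already points.
    public static double gunTurn(double robotHeading, double enemyBearing,
                                 double gunHeading) {
        return normalize(robotHeading + enemyBearing - gunHeading);
    }
}
```

For example, a raw turn of 270 degrees normalizes to -90 degrees, so the gun swings 90 degrees the other way instead of taking the long route.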




Furthermore, Position04 and Position05 introduced more complications.  These two katas ask for a robot that can move to a point, and the library does not provide any functions like that.  Consequently, I had to create my own moveToPoint functions.  These functions use trigonometry (http://www.clarku.edu/~djoyce/trig/ was a big help!) to determine the heading and distance to the point, turn the robot to that heading, and move the determined distance.  Initially, these functions had a bug where the robot would get on top of the point but keep slowly moving, even though it should have detected that it was on the point and stopped.  This problem took a while to debug, and it turned out to be because the Robocode system uses double floating point values for the coordinates of the battlefield, which makes matching exact positions tricky.  For example, if I told the robot to go to (400, 600) and the Robocode system had the robot's position at (400.0000000000001, 600.0), it would constantly try to adjust and never stop, as the coordinates are never exactly equal.  As a result, I decided to round the coordinates to integer values of the long data type to prevent such precision errors, since fraction-of-a-pixel differences are unnoticeable.  Position05 also introduced another problem in that a robot cannot actually reach the corners of the battlefield due to its body dimensions and would push against the wall ad infinitum.  Therefore, I had to modify the moveToPoint functions to adjust points that are outside of the reachable range.  Since these functions are shared by the last three Position katas, I decided to put them in a separate subclass of the Robot class to prevent a lot of copying and pasting (You may find the source code for this subclass here).
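The core trigonometry of such a moveToPoint helper might look like this (an illustrative sketch, not the linked source; note that Robocode measures headings in degrees clockwise from north, which is why atan2 takes the x difference before the y difference):

```java
public class NavMath {
    // Straight-line distance from (x, y) to the target point.
    public static double distanceTo(double x, double y,
                                    double tx, double ty) {
        return Math.hypot(tx - x, ty - y);
    }

    // Heading from (x, y) to the target, in Robocode's convention:
    // degrees clockwise from north (0 = up, 90 = right).
    public static double headingTo(double x, double y,
                                   double tx, double ty) {
        double heading = Math.toDegrees(Math.atan2(tx - x, ty - y));
        return heading < 0 ? heading + 360 : heading;  // map to [0, 360)
    }
}
```

A moveToPoint built on these would turn the robot to headingTo(...), then drive ahead by distanceTo(...), with the rounding and out-of-range adjustments described above layered on top.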

Finally, there is Position06.  This kata built on the problems from Position04 and Position05 with the added requirement that the robot must go in a circle with a radius of ~100 pixels.  Initially, I thought that I could use the circling method of the Spinbot provided in the samples, but I realized that it was just constantly turning, so its circling radius was unknown.  Instead, I had to use the circle equation to calculate the points to move to.  I decided to provide the x coordinates and the radius to find the y coordinates of the target points, but this introduced another problem: the circle equation allows both positive and negative square roots, which made the robot stall at the points where the sign should change if only the positive Math.sqrt result is used.  To combat this, my robot flips the sign of the square root when it reaches the points where the sign should change (90 and 270 degrees).
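For comparison, a parametric sketch (not my original sqrt-based solution) sidesteps the sign problem entirely, because each angle maps to exactly one point on the circle:

```java
public class CirclePoints {
    // Generates waypoints on a circle of the given radius around
    // (cx, cy).  The parametric form x = cx + r*cos(t), y = cy + r*sin(t)
    // never needs a +/- square root: one angle, one point.
    public static double[][] waypoints(double cx, double cy,
                                       double radius, int count) {
        double[][] points = new double[count][2];
        for (int i = 0; i < count; i++) {
            double theta = 2 * Math.PI * i / count;
            points[i][0] = cx + radius * Math.cos(theta);
            points[i][1] = cy + radius * Math.sin(theta);
        }
        return points;
    }
}
```

Feeding these waypoints to a moveToPoint-style helper one by one would trace the circle and end where it started, as the kata requires.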

Conclusion


Now that I have completed these 13 code katas, I feel that I have the knowledge needed to create a competitive robot.  These exercises familiarized me with the basic movement, detection, and gun-use functions and gave me some ideas about robot design.  For instance, I know that a competitive robot will need a lot of movement, as dodging enemy bullets makes opponents waste energy and saves your own (some of the Position robots actually beat some of the sample robots by making them run out of energy, even though they themselves never fired a single shot!).  I also understand that I will need to create a good tracking scheme so that most of my bullets hit.  Perhaps I could lead the target if it is moving to increase the accuracy of my robot's shots.  Finally, I think that code katas are a good learning device.  As many people have said, the best way to learn how to program is to actually do it, and these katas provide short and interesting tasks to hone one's programming skills without the pressures of a project setting.  Thus, code katas like these are a good way to work on one's programming abilities off the job.

Tuesday, August 30, 2011

FizzBuzz!

FizzBuzz is a simple program that can be used to test one's ability to complete basic programming tasks. To comply with the requirements of the FizzBuzz program, the created program should print out the numbers from 1 to 100 on separate lines (one output per line), except that multiples of 3 should be replaced by "Fizz" and multiples of 5 by "Buzz". If a number is a multiple of both 3 and 5, the program should print "FizzBuzz". The task here was to implement this program in Java using the Eclipse IDE in an effort to familiarize ourselves with it.



Initial Program
Below is the Java source code that I created to implement the FizzBuzz program. This implementation took me 5 minutes and 9 seconds despite the assistance of an advanced IDE like Eclipse due to an error I made while creating the FizzBuzz.java class file. When I created the class file, I tried to specify that it should be in a non-default package, but I made the mistake of doing this in the top-level project folder instead of the src folder. This caused Eclipse to create several new folders to place the new class file in instead of creating a new package as I had intended. Consequently, I could not get the created file to run until I created a new package in the src folder and moved the FizzBuzz.java file there.

What Eclipse did when I tried to make FizzBuzz.java in a non-default package in the top-level project folder.
What the non-default package should look like.
The source code for the FizzBuzz.java class may be seen below.
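A version along the lines described above (a reconstruction; the original file was shown only as a screenshot, so details may differ):

```java
/** Classic FizzBuzz: prints 1 through 100, one line per number,
 *  substituting Fizz, Buzz, or FizzBuzz for the appropriate multiples. */
public class FizzBuzz {
    public static void main(String[] args) {
        for (int i = 1; i <= 100; i++) {
            if (i % 15 == 0) {            // multiple of both 3 and 5
                System.out.println("FizzBuzz");
            } else if (i % 3 == 0) {
                System.out.println("Fizz");
            } else if (i % 5 == 0) {
                System.out.println("Buzz");
            } else {
                System.out.println(i);
            }
        }
    }
}
```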



A Better FizzBuzz
While this code does work as intended, it is not as elegant as it could be. In this form, it is difficult to test with JUnit, since the program writes directly to standard output rather than returning values that a test can check. As a result, we need to move the logic into a function that a JUnit test can call, as in the modified FizzBuzz program called FizzBuzz2.java.
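A sketch of what FizzBuzz2 might look like, assuming the per-number logic moves into a fizzBuzz(int) method that returns its line as a String (the method name follows the post; the exact original code was shown only as a screenshot):

```java
/** Testable FizzBuzz: the per-number logic lives in a method that
 *  returns its result, so a JUnit test can check it directly. */
public class FizzBuzz2 {

    /** Returns the FizzBuzz output line for a single number. */
    public static String fizzBuzz(int i) {
        if (i % 15 == 0) {
            return "FizzBuzz";
        }
        if (i % 3 == 0) {
            return "Fizz";
        }
        if (i % 5 == 0) {
            return "Buzz";
        }
        return String.valueOf(i);
    }

    public static void main(String[] args) {
        // Printing is now just a thin wrapper around the testable method.
        for (int i = 1; i <= 100; i++) {
            System.out.println(fizzBuzz(i));
        }
    }
}
```

Separating "compute the answer" from "print the answer" is the whole trick: the same loop still produces the required output, but the logic is now reachable from a test.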




The JUnit Test
With FizzBuzz2, the JUnit test can call the FizzBuzz2.fizzBuzz(int) function to test the output. The following JUnit test checks the outputs of the two end cases (1 and 100) as well as the outputs when 3, 5, 15, 25, 41, 51, 75, and 94 are passed to the function.
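The actual test was shown only as a screenshot, so here is a standalone sketch of the same checks. Plain assertions stand in for JUnit's assertEquals so the example runs by itself, and the fizzBuzz logic is repeated inline to keep it self-contained; the real test would instead call FizzBuzz2.fizzBuzz(int) from a JUnit test method:

```java
/** Standalone version of the checks the JUnit test performs:
 *  the two end cases (1 and 100) plus the sampled inputs listed above.
 *  Run with assertions enabled: java -ea FizzBuzzChecks */
public class FizzBuzzChecks {

    // Repeated inline so this example compiles on its own;
    // the real test calls FizzBuzz2.fizzBuzz(int).
    static String fizzBuzz(int i) {
        if (i % 15 == 0) return "FizzBuzz";
        if (i % 3 == 0) return "Fizz";
        if (i % 5 == 0) return "Buzz";
        return String.valueOf(i);
    }

    static void check(int input, String expected) {
        String actual = fizzBuzz(input);
        if (!actual.equals(expected)) {
            throw new AssertionError(input + ": expected " + expected + ", got " + actual);
        }
    }

    public static void main(String[] args) {
        check(1, "1");          // lower end case
        check(100, "Buzz");     // upper end case
        check(3, "Fizz");
        check(5, "Buzz");
        check(15, "FizzBuzz");
        check(25, "Buzz");
        check(41, "41");
        check(51, "Fizz");
        check(75, "FizzBuzz");
        check(94, "94");
        System.out.println("All checks passed.");
    }
}
```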


Fortunately, this test runs successfully and helps to assure us that the output of the FizzBuzz program is correct.



Conclusion
Creating the FizzBuzz program was a good exercise in using Eclipse to create a Java project from scratch, and it brought up an interesting quirk of the system: specifying a non-default package for a file in the main project directory causes strange behavior. It also shows that software engineering is not a race to get a working product out. Of course, producing a working product as fast as possible sounds good, but revising that product can lead to a much more elegant design that saves time and effort later on. Finally, the FizzBuzz program highlighted the need to pay attention and take notes in ICS 314. I was unfamiliar with JUnit, so I had no idea how to use it until I looked at the notes that I took on the first day of class. Hence, it is important to pay attention to the various course materials (non-webcast lectures included!), since knowing that material (or at least keeping a reference to look it up) could really help with the assignments we will do in the future.

Sunday, August 28, 2011

orDrumbox and the Three Prime Directives of Open Source Software

In our ICS 314 class, Dr. Johnson has established three criteria that open source software should be judged against to determine its quality.  Here we shall examine the quality of the open source Java project called orDrumbox using these three criteria.

Overview
The orDrumbox project aims to create a software drum machine with additional song composition functions.  These functions include the ability to create and modify multiple instrument tracks to create new songs and an automatic fill engine to add various patterns to the songs.  For this test, I have downloaded the setup-ordrumbox-0.9.06-win32.exe installer and the source files from the project's sourceforge.net page.

For more information, please see the project's home website.

Prime Directive #1
The first prime directive asks whether the system accomplishes a useful task.  orDrumbox appears to accomplish its task rather effectively with an easy-to-use interface that lets users choose song templates and edit them using the various buttons and by clicking on the step sequencer.  It also allows the user to play these songs, but the sound quality seems to be lacking, as I heard a lot of static.  While the poor sound quality detracts from the user's experience, it is good enough to give the user a clear picture of how the song sounds.  All in all, orDrumbox succeeds in fulfilling its task as a drum machine and a song composition tool.



The orDrumbox user interface with a pre-made song loaded.
A modified version of the pre-made template with extra cymbal hits and a new cowbell track.  The changes are in red.

Prime Directive #2
The second prime directive concerns how easily an external user can install the software.  orDrumbox was extremely easy to install thanks to the provided setup executable: all the user needs to do is have Java 1.6 installed and follow the installer, which can even create a desktop shortcut.  I did have some problems getting the program to launch from the shortcut, but I believe this was due to the way my system is set up, and I could easily launch it from the command line using the java -jar command or by executing the Launcher script in the installation directory.  As a result, orDrumbox easily passes the second directive.

The orDrumbox installer makes installation extremely simple for anyone who has installed a Windows program before.
Finished with the installation.  The installer even includes installation progress bars like most other Windows installers!

Prime Directive #3
The third and final directive is fulfilled if it is easy for an external developer to understand and modify or enhance the system.  Unfortunately, this is where orDrumbox falls short.  The source package fails to include any developer-level documentation for external developers, which is definitely not desirable.  In addition, the source code itself suffers from a lack of commenting, as seen below.  However, some of the function and variable names are rather self-explanatory, which might make understanding the source code a little easier.

public Element toXml(Document xmldoc) {
    Element scaleElement = xmldoc.createElement("scale");
    scaleElement.setAttribute("display_name", getDisplayName());
    scaleElement.setAttribute("freq", getFreq() + " ");
    scaleElement.setAttribute("rand", getRand() + " ");
    scaleElement.setAttribute("order", getOrder() + " ");
    scaleElement.setAttribute("lgr_segment", getLgrSegment() + " ");

    for (int iNumScalenote = 0; iNumScalenote < scaleNotes.size(); iNumScalenote++) {
        Scalenote scalenote = (Scalenote) scaleNotes.get(iNumScalenote);
        scaleElement.appendChild(scalenote.toXml(xmldoc));
    }

    return scaleElement;
}

Looking at the source code, the problems with trying to work on this system become apparent.  For example, an external developer would probably guess from its name that the function above exports a scale element to an XML file, but the lack of any commenting makes it difficult to know for certain without going through each line of the function.  Consequently, orDrumbox fails the third directive: with no documentation and very little commenting, external developers are forced to read every line of code to understand what the program is actually doing.
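Even a few Javadoc and inline comments would spare readers that line-by-line walk. The following is a simplified, commented stand-in for the method above; the Scale class and its two fields are invented for illustration and are not orDrumbox's real classes, but the XML-building pattern is the same (it uses only the JDK's built-in org.w3c.dom API):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

/** Illustrative stand-in for an orDrumbox scale: two properties
 *  and a commented toXml, showing what documentation could look like. */
public class Scale {
    private final String displayName;
    private final double freq;

    Scale(String displayName, double freq) {
        this.displayName = displayName;
        this.freq = freq;
    }

    /**
     * Serializes this scale as a {@code <scale>} element of the given
     * document. Each property becomes an attribute on the element.
     * (The trailing space on freq mirrors the original code's quirk.)
     */
    public Element toXml(Document xmldoc) {
        Element scaleElement = xmldoc.createElement("scale");
        scaleElement.setAttribute("display_name", displayName);
        scaleElement.setAttribute("freq", freq + " ");
        return scaleElement;
    }

    public static void main(String[] args) throws Exception {
        // Build an empty DOM document to serialize into.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element e = new Scale("major", 440.0).toXml(doc);
        System.out.println(e.getTagName() + " " + e.getAttribute("display_name"));
    }
}
```

With comments like these, a developer can confirm the method's purpose in seconds instead of tracing every setAttribute call.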

Conclusion
Overall, orDrumbox appears to successfully fulfill two of the three prime directives of open source software.  Not only does it accomplish its intended tasks, but it is also extremely easy to install on a Windows system.  On the other hand, the lack of documentation and commenting would make external development of this system rather difficult and time-consuming, as the only way to understand the program is to go over every single line of code.  Therefore, orDrumbox successfully complies with the first two prime directives, but fails to meet the third prime directive of open source software.