Prime Directive 1: Does the system accomplish a useful task?
The WattDepot project and server, for those unfamiliar with it, is a project started by Professor Philip Johnson for the continuous monitoring of power consumption in the dormitories on the campus of the University of Hawaii at Manoa. The system makes use of multiple sensors installed throughout the dormitories, each keeping track of its individual section. This information is then provided to the user through an API implemented in Java. Team Pichu used this API to build the CLI being assessed in this review. Such an accessible interface allows the user to easily assess the energy consumption of a specific area in a building. This, in turn, would allow organizations to conduct energy audits, which are a major part of reducing our energy consumption and dependence on foreign oil, as mentioned in my previous blog post. As such, the task performed by the CLI is incredibly useful.
Prime Directive 2: Can an external user successfully install and use the system?
The command-line client provided by Team Pichu is an excellent way of interacting with the WattDepot API. Although many programs nowadays make use of GUIs, command-line interfaces are still an efficient way to control a program. In this case, we are provided with a .jar file that can be easily run via the command line with Java's -jar option. The client is fairly easy to understand from the user's standpoint, especially with the help of the documentation on Pichu's homepage, as seen in this screenshot.
Besides the help page, however, the feedback provided to the user on bad input is lacking. Outside of notifying the user that the wrong number of arguments has been passed, the program does not specify what is wrong with an argument. For instance, if a person mistyped the source name, or provided an invalid date (like a date in the future), the program would simply state that an argument is invalid. Better feedback would specify exactly what is wrong with a given input, without the user having to consult the help page.
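To illustrate the kind of feedback I have in mind, here is a minimal sketch (the class and method names are my own invention, not Team Pichu's actual code) of a date check that reports the specific problem rather than a generic "invalid argument" message:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

// Hypothetical sketch of argument validation with specific error messages.
public class ArgumentFeedback {
  /** Returns a specific error message for a date argument, or null if valid. */
  public static String checkDate(String arg) {
    SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd");
    format.setLenient(false); // reject nonsense like 2011-13-45
    Date date;
    try {
      date = format.parse(arg);
    }
    catch (ParseException e) {
      return "Invalid date '" + arg + "': expected format yyyy-MM-dd.";
    }
    if (date.after(new Date())) {
      return "Invalid date '" + arg + "': date cannot be in the future.";
    }
    return null;
  }
}
```

With checks like these, a mistyped date produces a message that tells the user both what was wrong and what was expected.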
Prime Directive 3: Can an external developer successfully understand and enhance the system?
Being developers, the spirit of providing a functional system is also to provide something that can be easily extended, providing even more functionality. Team Pichu has done the right thing here and provided their code for free on their Google Project Hosting site. All one has to do in order to enhance the system is use the SVN revision control system. Google's hosting allows the user to simply input 'svn checkout http://hale-aloha-cli-pichu.googlecode.com/svn/trunk/ hale-aloha-cli-pichu-read-only' into the command line to download the system into the current directory. From there, the DeveloperGuide provided by Team Pichu gives simple instructions for building the system, the build tools they've used, and how to generate documentation for the system, though none on extending it (importing it into Eclipse, for instance).
Upon importing the project into Eclipse, I found that the dependency on the WattDepotClient was broken as well. The .settings file provided by the team for use in Eclipse specified an absolute classpath instead of one relative to the project's root. As such, the build path specified by the project did not exist on my system, making compilation and extension impossible out of the box. Luckily, the problem can be remedied by specifying the jar's path relative to the project, which would fix it for all environments.
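For the curious, the fix looks roughly like the following (an illustrative Eclipse classpath fragment; the actual file paths and jar name in Pichu's project may differ):

```xml
<!-- Broken: an absolute path tied to one developer's machine. -->
<classpathentry kind="lib" path="C:/Users/someone/lib/wattdepot-client.jar"/>

<!-- Fixed: a path relative to the project root works in any environment. -->
<classpathentry kind="lib" path="lib/wattdepot-client.jar"/>
```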
The code itself is thoroughly documented with Javadoc, though there is little explanation of the strategy implemented. For instance, even after generating and reading through the Javadoc, I'm still not sure how the Command classes determine a valid source (it seems they use a regex, which generally isn't very readable or understandable to anyone who didn't write it) or why they use the Date and SimpleDateFormat classes to check and generate timestamps.
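To show why the regex approach deserves a comment or two, here is a sketch of what such a source check might look like. The pattern below is my own guess at a rule accepting tower names like "Mokihana" with optional suffixes like "-04-lounge"; it is not Team Pichu's actual pattern:

```java
import java.util.regex.Pattern;

// Hypothetical illustration of regex-based source validation.
public class SourceCheck {
  // Assumed rule: a letters-only tower name, optionally followed by
  // hyphen-separated segments (e.g. "Mokihana-04-lounge").
  private static final Pattern SOURCE =
      Pattern.compile("[A-Za-z]+(-[0-9A-Za-z]+)*");

  public static boolean isValidSource(String name) {
    return SOURCE.matcher(name).matches();
  }
}
```

Even this tiny pattern is opaque without the comment above it, which is exactly why the strategy should be explained somewhere in the Javadoc.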
The Developer Guide provides excellent instructions for continuing their method of quality assurance. Their method involves the automated quality assurance tools Checkstyle, PMD, and FindBugs, which are activated with their provided Ant build file, verify.build.xml. Locally, they emphasize the importance of using the provided tools to ensure standardized, working code. They've also set up a server running the Jenkins continuous integration tool to monitor their repository. Jenkins, using the same Ant build file, monitors the state of the system and emails the responsible developers if any part of the verification fails. This allows members to quickly troubleshoot and restore the system to a working state in as little time as possible (looking at the previous changes to the system so far, each problem was fixed in roughly 20 minutes, which is better than, say, changing the system and then running it later that day only to realize that it doesn't work).
The developer guide also emphasizes maintaining the correctness of the program using JUnit test cases. Looking at the test cases and running the included JaCoCo build file, the team extensively tests the various commands using both good and bad input, resulting in a well-tested system with 85% of the instructions covered.
The team also used Google Project Hosting's Issue feature to document the build process and relate each change to the source to a specific issue. Looking at the issues and previous commits, it seems that two of the three team members implemented most of the code, but did so in a way that makes it easy to see what changes were made and which issues they relate to.
In all, even though some documentation was lacking and a little troubleshooting was required, the system appears to be extensible.