Taking on another group's code is no easy task; the difficulty can range from perplexing to hair-pulling. Thankfully, taking on Team Pichu's implementation of a WattDepot command line fell into the former category (our friends at Team Teams, who inherited the system previously built by us at Team Cycuc, unfortunately fell into the latter). As previously blogged about, extensibility starts with open-sourcing one's code and providing thorough documentation. Another important part of making code easily extensible is simple, comprehensible code, which Team Pichu has provided. Extending their system was no harder than creating a new Command that implemented the provided interface and adding it to their processor class (which used branching rather than enumerations; while less elegant and efficient, it is easier to understand). As such, the members of Team Cycuc (Yong Hong Hsu, David Wilkie, and I) successfully created Commands that extended Team Pichu's implementation, though communication issues left us with an implementation that does not meet specifications.
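The extension pattern described above is simple enough to sketch. The names below are illustrative, not Team Pichu's actual identifiers; the point is the shape of it: one interface, one new class, one new branch in the processor.

```java
// Sketch of the extension pattern, with hypothetical names.

// The provided interface: one method that each command implements.
interface Command {
    String run(String[] args);
}

// A new command is added simply by implementing the interface.
class SetBaseline implements Command {
    public String run(String[] args) {
        if (args.length != 1) {
            return "set-baseline requires exactly one source argument";
        }
        return "Baseline set for " + args[0];
    }
}

// The processor dispatches with branching rather than an enumeration.
class Processor {
    String process(String input) {
        String[] tokens = input.trim().split("\\s+");
        String name = tokens[0];
        String[] args = java.util.Arrays.copyOfRange(tokens, 1, tokens.length);
        if ("set-baseline".equals(name)) {
            return new SetBaseline().run(args);
        }
        // ...one branch per remaining command...
        return "Unknown command: " + name;
    }
}
```

Branching like this grows a long if-chain as commands are added, but as noted above, anyone can read it and see exactly where to hook in a new command.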
In many ways, it was easier to work with an existing implementation, since issue-driven management means one can easily assign tasks via Issues in Google Project Hosting. The problem with Issues is that their mere existence doesn't guarantee they're correct, or that no further communication is required. The problems with our implementation came from poor planning and coordination, and from the assumption that the commands do not depend on each other. If we as a team had read the specifications thoroughly, we'd have known what each of us had to do, and when. Monitor-goal, assigned to Yong Hong Hsu, relies on set-baseline and monitor-power to successfully monitor whether a source achieves a goal, but fails to correctly integrate the methods coded by David and me. In retrospect, my implementation wouldn't have worked as a monitoring device anyway, since the power isn't automatically updated while MonitorPower.run() is running; still, Yong Hong never raised an Issue about it, nor was I contacted in any other way.
With this in mind, I can easily say that our build is a partial failure in terms of the first Prime Directive, "Does the system accomplish a useful task?", since it provides only partial functionality: it sets baselines and monitors power, but doesn't let the user monitor whether a tower lowers its energy consumption to a certain percentage of the baseline. It does, however, print out the power at a regular interval, so that users can monitor the power of a WattDepot tower themselves.
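As a rough sketch of what that interval-printing loop looks like (with a stand-in interface where the real WattDepot query would go; this is not Pichu's or my actual code):

```java
import java.util.ArrayList;
import java.util.List;

class MonitorPower {
    // Stand-in for the WattDepot client call; the real query is not shown here.
    interface PowerSource {
        double currentPower();
    }

    // Polls the source and prints its power every intervalMillis, 'times' times.
    // Returning the readings is what would let a goal-monitor compare them
    // against a baseline percentage.
    static List<Double> monitor(PowerSource source, long intervalMillis, int times) {
        List<Double> readings = new ArrayList<>();
        for (int i = 0; i < times; i++) {
            double watts = source.currentPower();
            System.out.printf("Current power: %.1f W%n", watts);
            readings.add(watts);
            try {
                Thread.sleep(intervalMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
        }
        return readings;
    }
}
```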
The ease with which I implemented MonitorPower (it took less than a day of coding) shows that the other two Prime Directives are achieved, all credit to Team Pichu. Compared to Team Teams' reaction to taking over our code, Pichu's code was incredibly easy to understand thanks to its simple constructs. The lesson: the KISS principle should always be followed, especially if one expects to open-source a project so that others can add functionality.
UPDATE (8:45 am): After much running-around-with-hair-on-fire coding, I've gotten monitor-goal and monitor-power to work roughly as specified. It can be found here; it's a last-minute-heroics edition of the interface, since that's what I had to do.
RBV Engineering
Wednesday, December 14, 2011
Friday, December 2, 2011
Technical Review: Hale-Aloha-CLI-Pichu
A Technical Review is a rigorous examination of a system in which all of its aspects are scrutinized, pointing out errors that the developers themselves wouldn't have noticed. This review covers a group that developed a command-line interface for the WattDepot server provided by Professor Philip Johnson, a task we were also recently charged with. As such, they were put under the same development conditions as us, using Google Project Hosting and the Jenkins server found at http://dasha.ics.hawaii.edu:9859/ for Continuous Integration and Issue-Driven management.
Prime Directive 1: Does the system accomplish a useful task?
The WattDepot project and server, for those unfamiliar with it, was started by Professor Philip Johnson for the continuous monitoring of power consumption in the dormitories on the campus of the University of Hawaii at Manoa. The system makes use of multiple sensors installed throughout the dormitories, each keeping track of its individual section. This information is then provided to the user by an API implemented in Java, which Team Pichu used to build the CLI assessed in this review. Such an easy interface allows the user to readily assess the energy consumption of a specific area in a building. This, in turn, would allow corporations to conduct energy audits, a major part of reducing our energy consumption and dependence on foreign oil, as mentioned in my previous blog post. As such, the task performed by the CLI is incredibly useful.
Prime Directive 2: Can an external user successfully install and use the system?
The command line provided by Team Pichu is an excellent way of interacting with the WattDepot API. Although many programs nowadays use GUIs, command lines are still an efficient way to control a program. In this case, we are provided with a .jar file that can be easily run from the command line with Java's -jar option. The client is fairly easy to understand from the user's standpoint, especially with the help of the documentation on Pichu's homepage, as seen in this screenshot.
Such accessible documentation lets the user quickly understand how to query and get meaningful data from the WattDepot servers. After downloading the distribution from their site, I got all four commands running within a minute:
Beyond the help page, however, the feedback given to the user on bad input is lacking. Aside from notifying the user that the wrong number of arguments has been passed, the program does not specify what is wrong with an argument. For instance, if a person mistypes the source name, or provides an invalid date (like one in the future), the program simply states that an argument is invalid. Better feedback would specify what is wrong with the provided input, without the user having to consult the help page.
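A minimal sketch of what such specific feedback could look like, assuming the YYYY-MM-DD date convention the CLI uses (the class and messages below are hypothetical, not Pichu's code):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

class ArgumentChecker {
    // Returns "OK" for a usable date, otherwise a message saying exactly
    // which check failed, so the user doesn't need the help page.
    static String checkDate(String date) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd");
        fmt.setLenient(false);            // reject dates like 2011-13-45
        Date parsed;
        try {
            parsed = fmt.parse(date);
        } catch (ParseException e) {
            return "'" + date + "' is not a date of the form YYYY-MM-DD";
        }
        if (parsed.after(new Date())) {
            return "'" + date + "' is in the future; no data exists for it yet";
        }
        return "OK";
    }
}
```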
Prime Directive 3: Can an external developer successfully understand and enhance the system?
In the spirit of providing a functional system, developers should also provide something that can be easily extended to offer even more functionality. Team Pichu has done the right thing here and provided their code for free on their Google Project Hosting site. All one has to do in order to enhance the system is use the SVN revision control system: Google's hosting lets the user simply enter 'svn checkout http://hale-aloha-cli-pichu.googlecode.com/svn/trunk/ hale-aloha-cli-pichu-read-only' on the command line to download the system into the current path. From there, the DeveloperGuide provided by Team Pichu gives simple instructions for building the system, the build tools they used, and how to generate the documentation, though none on extending it (importing it into Eclipse, for instance).
Upon importing it into Eclipse, a dependency on the WattDepotClient is broken as well. The .settings file provided by the team for use in Eclipse specifies an absolute classpath instead of one relative to the project's root. As such, the build path specified by the project was not the same on my system, making compilation and extension impossible. Luckily, the problem could be remedied by specifying the jar relative to the project, which would solve it for all environments.
The code itself is thoroughly documented with Javadoc, though there is little explanation of the strategy used. For instance, even after generating and reading through the Javadoc, I'm still not sure how the Command classes determine a valid source (it seems to use regular expressions, which generally aren't very readable to anyone who didn't write them) or why they use the Date and SimpleDateFormat classes to check and generate timestamps.
The Developer Guide provides excellent instructions for continuing their method of quality assurance, which involves the automated quality assurance tools Checkstyle, PMD, and FindBugs, activated with the provided Ant build file verify.build.xml. Locally, they emphasize the importance of using these tools to ensure standardized, working code. They have also set up a server running the Jenkins continuous integration tool to monitor their repository. Jenkins, using the same Ant build file, monitors the state of the system and emails the responsible developers if any part of the verification fails. This lets members quickly troubleshoot and restore the system to a working state in as little time as possible (looking at the changes to the system so far, each problem was fixed in roughly 20 minutes, which beats changing the system and only discovering later that day that it doesn't work).
The Developer Guide also emphasizes maintaining the correctness of the program with JUnit test cases. Looking at the test cases and running the included JaCoCo build file, the team extensively tests the various commands using both good and bad input, resulting in a well-tested system with 85% of the instructions covered.
The team also used Google Project Hosting's Issue feature to document the build process and relate each change to the source to a specific Issue. Looking at the Issues and previous commits, it seems that two of the three team members implemented most of the code, but did so in a way that makes it easy to see what changes were made and which Issues they related to.
In all, even though some documentation was lacking and a little troubleshooting was required, the system appears to be extensible.
Tuesday, November 29, 2011
Issue Driven Management, Cooperation and the WattDepot CLI
There's little use in a good API if there isn't an easy way to interact with it. Enter the most l33t way of interacting with computers: the command-line interface. It having stood the test of time (I still find SVN management via the command line to be the best way to do it), we were tasked with implementing a CLI for the WattDepot server. And by "we", I mean the newly formed team of David Wilkie, Yong Hong Hsu, and myself, or Team Cybernetic Cucumber (Cycuc, for short). To streamline development and ensure that no one person is tasked with doing everything, we were encouraged (with our grades, no less) to incorporate Issue-Driven Development, Continuous Integration, and Google Project Hosting into our development process. The product can be found at http://code.google.com/p/hale-aloha-cli-cycuc/.
Issue Driven Management
The most important new aspect of this development was the concept of Issue-Driven Development. Instead of verbal planning, the team outlines Issues for tasks to be done, whether that's debugging, extending or enhancing the system, or just plain writing the documentation. In this digital age, to have anything less descriptive and concrete is a detriment to efficiency. Google Project Hosting incorporates Issues right into its client, making it easy for any person to file an Issue with the system and for the owners to comment and coordinate to ensure its resolution. For this project, we split up tasks with Issues, each of which corresponded to a task for the system. The split had me focus on the Processor and the DailyEnergy command, David on the main CLI, documentation, and the CurrentPower command, and Yong Hong on the RankTowers and EnergySince commands. This is easily reflected in the Issues page (with added Issues corresponding to defects found throughout development). All in all, I would definitely recommend Issue-Driven Management for any project involving any amount of coordination; text is a great medium for it (even though the Internet is distracting and impersonal).
The WattDepot CLI
Our command line is a simple one implemented in Java. It consists of a tiered hierarchy incorporating a user interface, several classes that correspond to queries to the WattDepot server, and a helper class to parse arguments. The hierarchy is as follows:
| edu.hawaii.halealohacli.Client
|-> edu.hawaii.halealoha.Processor
|-> edu.hawaii.halealoha.Command
The Client is the main loop that interacts with the user. It prompts the user to input a command and passes it to the subclasses depending on the command given. David's idea of implementing an Enum to handle both the operations as well as the arguments was genius, and made passing data to both of the subclasses extremely easy.
Once the command is determined, the rest of the string is passed to the Processor class to determine whether the sources and timestamps are valid. Source validation is implemented using the getSource(String string) method from the WattDepot API, and timestamps are validated simply by checking that the given string is in the form YYYY-MM-DD and that all of its values are valid for getting information from the WattDepot server. The Source and XMLGregorianCalendar objects required by queries to the WattDepot server are then stored locally and passed to the main Client as needed.
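Assuming the validation works roughly as described, it might be sketched like this; the class name is made up, but DatatypeFactory is the standard JDK route to the XMLGregorianCalendar that WattDepot queries expect:

```java
import javax.xml.datatype.DatatypeFactory;
import javax.xml.datatype.XMLGregorianCalendar;

class Timestamps {
    // Checks the YYYY-MM-DD shape, then builds the XMLGregorianCalendar the
    // WattDepot queries need. Returns null for anything unusable.
    static XMLGregorianCalendar parse(String text) {
        if (text == null || !text.matches("\\d{4}-\\d{2}-\\d{2}")) {
            return null;                 // wrong shape entirely
        }
        try {
            return DatatypeFactory.newInstance().newXMLGregorianCalendar(text);
        } catch (Exception e) {          // invalid values, e.g. month 13
            return null;
        }
    }
}
```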
The Command hierarchy is built on an interface with a single method, printResults, that all Commands implement. The interface ensures a common method one can use to extend the system. Each class queries the WattDepot client and prints out the results of that query. The following Commands were implemented:
current-energy [source]: The current energy usage of the given source.
energy-since [source] [date]: The energy usage of the source from the date provided to now.
daily-energy [source] [date]: The total amount of energy used by the source on that day.
rank-towers [date] [date]: List out the energy sources in ascending order by the amount of energy used between the two dates.
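A rough sketch of that shape, with a stubbed-in energy figure where the real WattDepot query result would go (identifiers are illustrative, not the exact ones in our repository):

```java
import java.util.Locale;

// The single-method interface that all Commands implement.
interface Command {
    void printResults();
}

class DailyEnergy implements Command {
    private final String source;
    private final String date;
    private final double kilowattHours;  // would come from the WattDepot query

    DailyEnergy(String source, String date, double kilowattHours) {
        this.source = source;
        this.date = date;
        this.kilowattHours = kilowattHours;
    }

    // Separated from printResults so the output line is easy to test.
    String format() {
        return String.format(Locale.US,
            "%s's energy consumption for %s was: %.1f kWh",
            source, date, kilowattHours);
    }

    public void printResults() {
        System.out.println(format());
    }
}
```

Because every command exposes the same printResults method, the Client can hold any of them behind the Command type and not care which query ran.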
Each Command comes with a JUnit test that provides 94% coverage, meaning that most of the instructions are tested.
There are several flaws in the implementation as it stands. Error reporting is hit-and-miss, since we throw ordinary Exception instances with custom messages, which may or may not be descriptive. A more thorough implementation would incorporate custom Exceptions thrown for each respective error (invalid source, invalid timestamp, timed out, etc.). Another bug (though it could be seen as a feature) is that sources aren't cleared when an error occurs, so if one inputs, say, "current-power Lehua", the Lehua source stays in the Processor until it's changed. This makes it easy for the user to make multiple queries against the same source without specifying it each time, but it also leads to odd behavior when a source is wrong (input "current-power foo" after the last command, for instance, and it throws an error and then prints the energy usage for Lehua again). More rigorous testing and reviews would certainly benefit the system.
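A sketch of that custom-Exception idea (all names below are hypothetical, not code from our repository); catching a specific exception is also the natural place to clear the cached source and avoid the stale-source quirk:

```java
// One exception type per failure mode, each carrying a specific message.
class InvalidSourceException extends Exception {
    InvalidSourceException(String source) {
        super("'" + source + "' is not a known source");
    }
}

class InvalidTimestampException extends Exception {
    InvalidTimestampException(String stamp) {
        super("'" + stamp + "' is not a date of the form YYYY-MM-DD");
    }
}

class Validator {
    // A canned source list stands in for a query to the WattDepot server.
    private static final java.util.Set<String> KNOWN = new java.util.HashSet<>(
        java.util.Arrays.asList("Lehua", "Mokihana", "Ilima", "Lokelani"));

    static void checkSource(String source) throws InvalidSourceException {
        if (!KNOWN.contains(source)) {
            // The caller can catch this, report the exact problem, and clear
            // any previously cached source.
            throw new InvalidSourceException(source);
        }
    }
}
```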
In all, it was a good challenge to work with a group on project development. Throughout our scholastic careers, we were expected to churn out code on our own, and by doing so we never really learn to write for anyone but ourselves. With some help from Automated Quality Assurance and Issue-Driven Development, I've learned practices I can use in the future when I'm developing not just for myself and a grade, but for a system that will be seen and modified by others, both in this development cycle and in future ones.
Tuesday, November 8, 2011
Energy, APIs and the Internet.
As mentioned in my last post, we can only go so far in producing enough energy from alternative resources. In this power-hungry world, that's an unfortunate truth. The fortunate truth is that technology has grown exponentially in the past century, and we now have the power to look at our energy consumption like never before. Simple meters can be installed and maintained to track how much energy we use, telling us what we need to change in order to use less.
The WattDepot client is one such technology. Produced by Professor Philip Johnson of the University of Hawaii at Manoa, WattDepot is easily configured to accept input from a variety of energy meters and relay it to those who need to analyze it. As proof, the current implementation keeps track of energy in one of the most chaotic environments known to man: the college dorm. All kidding aside, the result is a system that can easily be queried, returning up-to-the-minute information about the energy consumption and production of a particular area.
Energy data manipulation made easy
Using the WattDepot client API is incredibly easy. Since the service exposes a REST interface with XML payloads, one can use any language with an XML parser to read and generate data. An already-working implementation (in Java) can be found at the wattdepot-simpleapp Google Code page. The included file is a shell app that simply gets the first source returned by the client and calculates its latency. One can consult the WattDepot API to find out how to request data from the sources; the basics are in the Source and SensorData classes. Each source holds data about its energy (produced/consumed) and power (energy per unit time).
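To illustrate how little is needed to consume such a service, here is a sketch that parses a WattDepot-style XML response with the JDK's built-in DOM parser. The response body is stood in by a string, and the powerConsumed element is only illustrative of the kind of record involved, not necessarily WattDepot's exact schema:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

class SensorDataReader {
    // Pulls a power reading out of an XML response body.
    // Returns NaN if the XML is malformed or the element is missing.
    static double readPowerWatts(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            String value = doc.getElementsByTagName("powerConsumed")
                .item(0).getTextContent();
            return Double.parseDouble(value);
        } catch (Exception e) {
            return Double.NaN;
        }
    }
}
```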
One can obtain the data of a particular source by passing in the source name and a timestamp (the included Tstamp class under utils is incredibly helpful for manipulating these), getting a precise measurement from the source. The result is a huge resource of data that can be queried and sorted by energy use, which the Kukui Cup uses to track which dorm area is most eco-friendly. This technology is easily applicable to any area, be it residential, collegiate, or business, meaning that we as a human race can effectively know where our energy is going and how we can change for the better.
Tuesday, November 1, 2011
Hawaii's Unique Energy Situation
Hawai'i is the most isolated piece of land in the world; you could travel to outer space and back eight times in the distance it takes to reach the nearest major landmass. It should come as no surprise that it takes considerable effort to get things here. Whether by plane or by ship, the cost is reflected in our cost of living. Since we as islands have a limited amount of resources in coal and oil, we have to import the majority of our energy from elsewhere, resulting in energy costs double or triple those of the Mainland. As such, we have a growing incentive to convert to a more energy-efficient lifestyle.
As oil prices rise, this initiative is becoming more and more feasible. So much so that ex-Governor of Hawaii Linda Lingle signed a bill committing the state to meet 70% of its current energy needs through clean energy by 2030. This entails becoming 30% more energy-efficient while converting 40% of our current energy needs to renewable sources. While this may seem like a lofty goal, our unique location and resources, as well as the ever-advancing state of technology, make it seem like a realistic one.
Being in the middle of the ocean, we don't have much in the way of traditional carbon-based fuel. We do, however, have an abundance of the renewable resources that have been popularized over the past decade. Solar, wind, wave, geothermal: you name it, we have it. As the cost of investing in these technologies goes down, along with the rise of oil prices, we have a noticeable and realistic incentive to convert to renewable energy. As a state, we have already started to invest in wind farms and solar energy, as well as research into alternatives like kukui nut oil, putting us well on the way to renewable energy.
Being a new investment, our renewable energy can only cover so much of our current needs; our power plants generate far more electricity than our renewable capacity can currently replace. We therefore have to learn to cut our energy needs to make better use of renewable energy. This would come in the form of energy auditing and regulation. The majority of energy use in Hawai'i comes from the business and industrial sector. With current technology, a business can easily tell just how much electricity is being used and where. A feasible plan for cutting consumption would be to offer incentives, be they fines for excessive energy use or tax cuts for sufficient reduction, to spur businesses to undergo energy audits and retrofits. Whether by installing new technology or altering its energy policies, a business would have good reason to use less energy under this plan.
These strategies would almost definitely cut our dependence on foreign oil and act as a blueprint for the rest of the world in transitioning to alternative energy.
As oil prices rise, this initiative is starting to become a more and more feasible one. So much so that ex-Governor of Hawaii, Linda Lingle, signed a bill to make 70% of our current energy needs renewable by 2030. This entails becoming 30% more energy-efficient while converting 40% of our current energy needs to that of renewable energy. While this may seem like a lofty goal, our unique location and resources, as well as the ever-increasing state of technology, make it seem like a realistic one.
Being in the middle of the ocean, we don't have much in the way of traditional carbon-based fuel. We do, however, have a majority of renewable energy that has been popularized the past decade. Solar, wind, wave, geothermal, you name it, we have it. As the price in investment in these technologies goes down, along with the rise of oil prices, we have a noticeable and realistic incentive to convert our energy into renewable energy. As a state, we have a already started to invest in wind farms, solar energy as well as research into alternative energy like Kukui Nut Oil, that makes us well on the way to renewable energy.
Being a new investment, our renewable energy can cover only so much of our current energy needs. Our power plants generate far more electricity than our renewable sources can currently supply. We, therefore, have to learn to cut our energy needs to make better use of our renewable energy. This would come in the form of energy auditing and regulations. The majority of energy use in Hawai'i comes from the business and industrial sector. With current technology, a business can easily tell just how much electricity is being used and where. A feasible plan for cutting our energy consumption would be to offer incentives, be they fines for excessive energy use or tax cuts for sufficient reduction, to spur businesses to undergo energy audits and reconstruction. Whether by installing new technology or altering their energy policies, businesses would have good reason to use less energy under this plan.
These strategies would almost definitely cut our dependence on foreign oil and act as a blueprint for the rest of the world in transitioning to alternative energy.
Monday, October 24, 2011
Midterm
The following questions, I think, would be good for a midterm:
1) What are the pros/cons of Code Review and Automated Code Assurance? Why do we need both?
2) What event would you need to override in order for your Robocode Robot to react appropriately to hitting a wall?
3) What is the main principle of The Principle of Least Astonishment (found in The Elements of Java Style)?
4) What feature was added to Java to make dealing with primitive data types easier?
5) What JUnit method would you use to test if an object is equal to its expected outcome?
Sounds pretty good, if I do say so myself.
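As a hint for question 4: the feature is autoboxing, added in Java 5, which automatically converts between primitives like int and their wrapper classes like Integer. A minimal sketch (the class and variable names here are my own):

```java
import java.util.ArrayList;
import java.util.List;

public class AutoboxingDemo {
    public static void main(String[] args) {
        // Collections can only hold objects, not primitives.
        // Autoboxing wraps the int literal 42 in an Integer for us.
        List<Integer> scores = new ArrayList<>();
        scores.add(42);

        // Unboxing converts the stored Integer back to a primitive int.
        int first = scores.get(0);
        System.out.println(first + 1); // prints 43
    }
}
```

Before Java 5, you would have had to write `scores.add(Integer.valueOf(42))` and `scores.get(0).intValue()` by hand.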
Thursday, October 20, 2011
Version Management
The only thing that's constant is change. That holds for life, and for programming as well. As programs grow more and more complex, they need more and more maintenance. Even if you debug your code and release a perfect product, it's likely that you will want to make a change sometime in the future, for a new feature or an overlooked security issue, for instance. Version control is akin to an assistant that logs every change you make to your code. Acting like save points in a videogame, version control makes it obvious what changed and allows you to revert your code to a previous version if needed.
As a requirement for one of my classes, we had to use Subversion along with Google Code to host our Robocode project (the result can be found here.) Having experimented with Git while working with Heroku, Subversion came intuitively to me. Both have command-line and GUI interfaces, though on Linux the command line is really the best way to manage your files. With Subversion, you make changes to your local working copy, and Subversion notes what needs to change on the remote server. Updating your source is as easy as 'svn commit'. Uploading your code to a remote server is easy, too, as 'svn import URL' imports your base directory to that path, assuming the path is configured for Subversion management.
The only thing I miss from Git is an easy-to-find ignore list like .gitignore, but for the purposes of uploading a Robocode robot to Google Code, interfacing directly with the folder proved just fine (though it resulted in way more versions than I had planned.) Other than that, managing versions of your software is as easy as import, edit, commit.
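The import, edit, commit cycle described above can be sketched as a short command transcript. REPO_URL is a placeholder for your own repository path, and the file patterns and messages are illustrative, not from our actual project. (Subversion does have a rough analogue of .gitignore, the svn:ignore property, shown at the end.)

```shell
# One-time: publish a local project directory to the repository
svn import myrobot REPO_URL/trunk -m "Initial import"

# Day-to-day work happens in a checked-out working copy
svn checkout REPO_URL/trunk myrobot-wc
cd myrobot-wc

# Edit files, review what changed, then record the change remotely
svn status
svn commit -m "Tune wall-avoidance behavior"

# Per-directory ignore list, similar in spirit to .gitignore
svn propset svn:ignore "*.class" .
svn commit -m "Ignore compiled class files"
```

Setting svn:ignore would have avoided the extra versions I mentioned, since compiled .class files would no longer show up as changes to commit.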