Taking on another group's code is no easy task; the difficulty can range from perplexing to hair-pulling. Thankfully, taking on Team Pichu's implementation of a WattDepot command line fell into the former category (our friends at Team Teams, who inherited the system previously built by us at Team Cycuc, unfortunately fell into the latter). As previously blogged about, extensibility comes from open-sourcing one's code and providing thorough documentation. Another important part of making code easily extensible is simple, comprehensible code, which Team Pichu has provided. Extending it was no harder than creating a new Command that implemented the provided interface and adding it to their processor class (which uses branching rather than enumerations; while less elegant and efficient, it is easier to understand), roughly as sketched below. As such, the members of Team Cycuc, Yong Hong Hsu, David Wilkie and I, successfully created Commands that extended Team Pichu's implementation, though issues with communication meant the result does not meet the specification.
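To illustrate just how small the extension surface is, here is roughly the shape of a new command. The interface name, method signature and processor wiring below are my own stand-ins, not necessarily Pichu's exact code:

    // Hypothetical sketch: names and signatures are stand-ins, not Pichu's actual API.
    public class SetBaseline implements Command {
      private final WattDepotClient client;

      public SetBaseline(WattDepotClient client) {
        this.client = client;
      }

      @Override
      public void run(String[] args) throws Exception {
        // validate the arguments, query the client, remember the baseline...
      }
    }

    // ...plus one extra branch in the processor's if/else dispatch:
    if (commandName.equals("set-baseline")) {
      new SetBaseline(client).run(args);
    }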
In many ways, it was easier to work with an existing implementation, since issue-driven management means one can easily assign tasks via Issues in Google Project Hosting. The problem with Issues is that just because they're there doesn't mean they're correct, or that no further communication is required. The problems with our implementation came from poor planning and coordination, and from the assumption that the commands do not depend on each other. If we as a team had looked at the specification thoroughly, we'd have known what each of us had to do and when. Monitor Goal, assigned to Yong Hong Hsu, relies on set-baseline and monitor-power to monitor whether a source achieves a goal, but fails to correctly integrate the methods coded by David and me. In retrospect, my implementation wouldn't have worked as a monitoring device anyway, since the power isn't automatically refreshed while MonitorPower.run() is running; still, no Issue was raised by Yong Hong, nor was I contacted in any other way.
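For context, a real monitor has to re-query the server on every tick rather than reuse a single reading. A minimal sketch of that loop follows; the getter and stop condition are placeholders of my own, not WattDepot or Pichu method names:

    // Placeholder sketch: poll for a fresh reading each interval.
    while (!stopRequested()) {
      double watts = getCurrentPower(sourceName);   // fresh query every pass
      System.out.printf("%s: %.1f W%n", sourceName, watts);
      Thread.sleep(intervalSeconds * 1000L);        // wait one interval, then repeat
    }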
With this in mind, I can easily say that our build is a partial failure in terms of the first Prime Directive, "Does the system accomplish a useful task?", since it provides only partial functionality: it sets baselines and monitors power, but doesn't let the user monitor whether a tower lowers its energy consumption to a certain percentage of the baseline. It does, however, print out the power at a regular interval, so a user can monitor the power of a WattDepot tower manually.
The ease with which I implemented MonitorPower, however (it took less than a day of coding), shows that the other two Prime Directives are satisfied, all credit to Team Pichu. Compared to Team Teams' reaction to taking over our code, their code was incredibly easy to understand thanks to its simple constructs, which suggests the KISS principle should always be followed, especially if one expects to open-source a project and invite more functionality from other users.
UPDATE (8:45 am): After much running-around-with-hair-on-fire coding, I've gotten monitor-goal and monitor-power to work roughly as specified. It can be found here; it's a last-minute-heroics edition of the interface, since that's what it took.
Wednesday, December 14, 2011
Friday, December 2, 2011
Technical Review: Hale-Aloha-CLI-Pichu
A technical review is a rigorous examination of a system in which all aspects are scrutinized, pointing out errors that the developers themselves wouldn't have noticed. This review covers a group that developed a command-line interface for the WattDepot server provided by Professor Philip Johnson, a task we were also recently charged with. As such, they were put under the same development conditions as us, using Google Project Hosting and the Jenkins server found at http://dasha.ics.hawaii.edu:9859/ for Continuous Integration and Issue-Driven Management.
Prime Directive 1: Does the system accomplish a useful task?
The WattDepot project and server, for those unfamiliar with it, is a project started by Professor Philip Johnson for the continuous monitoring of power consumption in the dormitories on the campus of the University of Hawaii at Manoa. The system makes use of multiple sensors installed throughout the dormitories, each keeping track of its individual section. This information is then provided to the user through an API implemented in Java, which Team Pichu used to build the CLI being assessed in this review. Such an accessible interface allows a user to easily assess the energy consumption of a specific area in a building, which in turn allows corporations to conduct Energy Audits, a major part of reducing our energy consumption and dependence on foreign oil, as mentioned in my previous blog post. As such, the task provided by the CLI is incredibly useful.
Prime Directive 2: Can an external user successfully install and use the system?
The command line provided by Team Pichu is an excellent way of interacting with the WattDepot API. Although many programs nowadays make use of GUIs, command lines are still an efficient way to control a program. In this case, we are provided with a .jar file that can be easily run via the command line with Java's -jar option. The client is fairly easy to understand from the user's standpoint, especially with the help of the documentation on Pichu's homepage as seen in this screenshot.
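Starting it is a one-liner (the jar file name here is approximate; use whatever name their distribution ships with):

    java -jar hale-aloha-cli-pichu.jar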
Such accessible documentation allows the user to quickly understand how to query and get meaningful data from the WattDepot servers. After downloading the distribution from their site, I quickly got all four commands running within a minute.
Besides the help page, however, the feedback provided to the user when given bad input is lacking. Outside of notifying the user that the wrong number of arguments has been passed, the program does not specify what is wrong with an argument. For instance, if a person mistyped the source name, or if the person provided an invalid date (like a date in the future) the program would simply state that an argument is invalid. Better feedback would be to specify what is wrong with a provided input without the user having to consult the help page.
Prime Directive 3: Can an external developer successfully understand and enhance the system?
In the spirit of providing a functional system, developers should also provide something that can be easily extended to offer even more functionality. Team Pichu has done the right thing here and provided their code for free on their Google Project Hosting site. All one has to do in order to enhance the system is check it out with the SVN revision control system. Google's hosting allows the user to simply enter 'svn checkout http://hale-aloha-cli-pichu.googlecode.com/svn/trunk/ hale-aloha-cli-pichu-read-only' on the command line to download the system into the current path. From there, the DeveloperGuide provided by Team Pichu gives simple instructions for building the system, the build tools they used, and how to generate the documentation, though nothing on extending it (importing it into Eclipse, for instance).
Upon importing it into Eclipse, the dependency on the WattDepot client jar is broken as well. The Eclipse settings files provided by the team specify an absolute classpath instead of one relative to the project root. As a result, the build path did not resolve on my system, making compilation and extension impossible out of the box. Luckily, the problem can be remedied by specifying the jar path relative to the project, which would fix it for all environments.
The code itself is thoroughly documented with Javadoc, though there is little explanation of the overall strategy. For instance, even after generating and reading through the Javadoc, I'm still not sure how the Command classes determine a valid source (they seem to use regular expressions, which generally aren't very readable to anyone who didn't write them) or why they use the Date and SimpleDateFormat classes to check and generate timestamps.
The Developer Guide provides excellent instructions for continuing their approach to quality assurance. Their method involves the automated quality assurance tools Checkstyle, PMD and FindBugs, run with their provided Ant build file, verify.build.xml. Locally, they emphasize the importance of using the provided tools to ensure standardized, working code. They've also set up a Jenkins continuous integration server to monitor their repository. Jenkins, using the same Ant build file, monitors the state of the system and emails the responsible developers if any part of the verification fails. This allows members to troubleshoot and restore the system to a working state in as little time as possible (looking at the changes to the system so far, each problem was fixed in roughly 20 minutes, which is better than, say, changing the system and running it later that day only to realize that it doesn't work).
The developer guide also emphasizes maintaining the correctness of the program using JUnit test cases. Looking at the test cases and running the included JaCoCo build file, the team extensively tests the various commands using both good and bad input, resulting in a well-tested system with 85% instruction coverage.
The team also used Google Project Hosting's Issue feature to document the build process and relate each change to the source to a specific Issue. Looking at the Issues and previous commits, it seems that two of the three team members implemented most of the code, but did so in a way that makes it easy to see what changes were made and which Issues they relate to.
In all, even though some documentation was lacking and a little troubleshooting was required, the system appears to be extensible.
Tuesday, November 29, 2011
Issue Driven Management, Cooperation and the WattDepot CLI
There's little use in a good API if there isn't an easy way to interact with it. Enter the most l33t way of interacting with computers: the command-line interface. The CLI has stood the test of time (I still find SVN management via the command line to be the best way to do it), so it was a natural fit when we were tasked with implementing one for the WattDepot server. And by "we", I mean the newly formed team of David Wilkie, Yong Hong Hsu and myself, Team Cybernetic Cucumber (cycuc for short). To streamline development and ensure that not just one person is tasked with doing everything, we were encouraged (with our grades, no less) to incorporate Issue-Driven Development, Continuous Integration and Google Project Hosting into our development process. The product can be found at http://code.google.com/p/hale-aloha-cli-cycuc/.
Issue Driven Management
The most important new aspect of this development was the concept of Issue-Driven Development. Instead of verbal planning, the team outlines Issues for tasks to be done, whether it be debugging, extending or enhancing the system, or just plain writing the documentation. In this digital age, anything less descriptive and concrete is a detriment to efficiency. Google Project Hosting incorporates Issues right into its client, making it easy for anyone to file an Issue against the system and for the owners to comment and coordinate to see it through to resolution. In terms of the project, we split up tasks with Issues, each of which corresponded to a task for the system. The way we split up the work was for me to focus on the Processor and the DailyEnergy command, David to focus on the main CLI, documentation and the CurrentPower command, and Yong Hong to focus on the RankTowers and EnergySince commands. This is easily reflected in the Issues page (with added Issues corresponding to defects throughout the development process). All in all, I would definitely recommend Issue-Driven Management for any project requiring any amount of coordination; text is a great medium for coordination (even though the Internet is distracting and impersonal).
The WattDepot CLI
Our command-line client was a simple one implemented in Java. It consists of a tiered hierarchy incorporating a user interface, several classes that correspond to queries to the WattDepot server, and a helper class to parse arguments. The hierarchy is laid out as follows:
| edu.hawaii.halealohacli.Client
|-> edu.hawaii.halealoha.Processor
|-> edu.hawaii.halealoha.Command
The Client is the main loop that interacts with the user. It prompts the user to input a command and passes it to the subclasses depending on the command given. David had the genius idea of implementing an Enum to handle both the operations and their arguments, which made passing data to both of the subclasses extremely easy.
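David's actual enum isn't reproduced here; the sketch below just shows the general idea of pairing each operation with its expected argument count (names are my own):

    // Sketch of the idea (not David's actual code): each operation knows how
    // many arguments it expects, so parsing and validation live in one place.
    public enum Operation {
      CURRENT_POWER("current-power", 1),
      DAILY_ENERGY("daily-energy", 2),
      ENERGY_SINCE("energy-since", 2),
      RANK_TOWERS("rank-towers", 2);

      private final String commandName;
      private final int numArgs;

      Operation(String commandName, int numArgs) {
        this.commandName = commandName;
        this.numArgs = numArgs;
      }

      public static Operation fromString(String name) {
        for (Operation op : values()) {
          if (op.commandName.equals(name)) {
            return op;
          }
        }
        return null;
      }

      public int getNumArgs() {
        return this.numArgs;
      }
    }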
Once the command is determined, the rest of the string is passed to the Processor class to determine whether the sources and timestamps are valid. Source validation is done with the getSource(String) method from the WattDepot API, and timestamps are validated simply by checking that the given string is of the form YYYY-MM-DD and that all values are valid for querying the WattDepot server. The Source and XMLGregorianCalendar objects required by the queries are then stored locally and passed to the main Client as needed.
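Roughly, the timestamp check looks something like the following (reconstructed from the description above, not the exact code):

    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.GregorianCalendar;
    import javax.xml.datatype.DatatypeFactory;
    import javax.xml.datatype.XMLGregorianCalendar;

    /** Reconstruction of the idea: reject anything not of the form YYYY-MM-DD,
        then convert to the XMLGregorianCalendar that WattDepot queries expect. */
    public class TimestampParser {
      public static XMLGregorianCalendar parse(String text) throws Exception {
        SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd");
        format.setLenient(false);          // "2011-13-40" should fail, not roll over
        Date date = format.parse(text);    // throws ParseException on bad input
        GregorianCalendar calendar = new GregorianCalendar();
        calendar.setTime(date);
        return DatatypeFactory.newInstance().newXMLGregorianCalendar(calendar);
      }
    }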
The Command component is an interface with a single method, printResults, that all Commands implement; the interface ensures a common entry point one can use to extend the system (a minimal sketch appears after the command list below). Each Command class queries the client and prints out the results of that query. The following Commands were implemented:
current-power [source]: The current power usage of the given source.
energy-since [source] [date]: The energy usage of the source from the date provided to now.
daily-energy [source] [date]: The total amount of energy used by the source on that day.
rank-towers [date] [date]: List out the energy sources in ascending order by the amount of energy used between the two dates.
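A minimal sketch of the interface itself; the real method may take the Processor or the parsed arguments as parameters:

    // Sketch of the Command interface described above.
    public interface Command {
      void printResults() throws Exception;
    }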
Each Command comes with a JUnit test that provides 94% coverage, meaning that most of the instructions are tested.
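To give a flavor of those tests, here is an illustrative one; it exercises the TimestampParser sketched earlier rather than the project's actual classes:

    import static org.junit.Assert.assertNotNull;

    import org.junit.Test;

    /** Illustrative JUnit test in the spirit of the project's suite. */
    public class TestTimestampParser {

      @Test
      public void acceptsWellFormedDate() throws Exception {
        assertNotNull("2011-11-20 should parse", TimestampParser.parse("2011-11-20"));
      }

      @Test(expected = Exception.class)
      public void rejectsMalformedDate() throws Exception {
        TimestampParser.parse("11/20/2011");
      }
    }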
There are several flaws with the implementation as it stands now. Reporting errors to the user is hit-or-miss, since we throw generic Exception instances with custom messages, which may or may not be descriptive. A more thorough implementation would incorporate custom Exceptions thrown for each respective error (invalid source, invalid timestamp, timed out, etc.), as sketched below. Another bug (though it could be seen as a feature) is that the sources aren't cleared when an error occurs, so if one were to input, say, "current-power Lehua", the Lehua source stays in the processor until it is changed. This makes it easier for the user to make multiple queries to the same source without specifying it every time, but it also means odd behavior when a source is wrong (if one were to input "current-power foo" after the last command, it would print an error and then print the power for Lehua again, for instance). More rigorous testing and reviews would certainly benefit the system.
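For instance, something along these lines (hypothetical names, not code from the repository):

    // Hypothetical example of the custom exceptions suggested above.
    public class InvalidSourceException extends Exception {
      public InvalidSourceException(String sourceName) {
        super("Source '" + sourceName + "' is not known to the WattDepot server.");
      }
    }

    // The processor could then throw it where a generic Exception is thrown today:
    // throw new InvalidSourceException(sourceName);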
In all, it was a good challenge to work with a group on project development. Throughout our scholastic careers, we were expected to churn out code on our own, and by doing so we never really learned how to write for anyone but ourselves. With some help from Automated Quality Assurance and Issue-Driven Development, I've learned a few practices I can use in the future when I'm not just developing for myself and a grade, but for a system that will be seen and modified by others, both in the current development cycle and in future ones.
Tuesday, November 8, 2011
Energy, APIs and the Internet.
As mentioned in my last post, we can only go so far in regards to producing enough energy from alternative resources. In this power-hungry world, it's an unfortunate truth. The fortunate truth is that the state of technology has grown exponentially in the past century and we now have the power to look at our energy consumption in ways like never before. Simple meters can be installed and maintained to keep track of how much energy we use, meaning that we can know what we need to change in order to use less energy.
The WattDepot client is one such technology. Produced by Professor Philip Johnson of the University of Hawaii at Manoa, WattDepot is easily configurable to accept input from a variety of energy meters and relay it to those who need to analyze it. As a proof of concept, the current installation keeps track of energy in one of the most chaotic environments known to man: the college dorm. All kidding aside, the result is a system that can easily be queried, returning up-to-the-minute information about the energy consumption and production of a particular area.
Energy data manipulation made easy
Using the WattDepot client API is incredibly easy. Since it exposes an XML-based REST interface, one can use any language with an XML parser to read and generate data. An already-working implementation (in Java) can be found at the wattdepot-simpleapp Google Code page. The included file is a shell of an app that simply gets the first source returned by the client and calculates its latency. One can consult the WattDepot API to find out how to request data from the sources. The basics are in the Source and SensorData classes: each source exposes data about its energy (watt-hours produced/consumed) and power (watts, i.e., energy per unit time).
One can obtain the data of a particular source by passing in the source name and a timestamp (the included Tstamp class under utils is incredibly helpful for manipulating these), getting a precise measurement from the source. The result is a huge resource of data that can be queried and sorted according to energy use, which is what the Kukui Cup uses to keep track of which dorm area is most eco-friendly. This technology is easily applicable to any area, be it residential, collegiate or business, meaning that we as a human race can effectively know where our energy is going and how we can change for the better.
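In practice the pattern looks roughly like this; the class and method names are from memory of the WattDepot client and the course examples, so treat the exact signatures as approximate:

    // Rough sketch of querying WattDepot; server URL and source name are placeholders.
    WattDepotClient client = new WattDepotClient("http://example-server/wattdepot/");
    Source source = client.getSource("SOURCE_NAME");                 // getSource(String) per the API
    XMLGregorianCalendar now = Tstamp.makeTimestamp();               // Tstamp helper from the utils package
    SensorData data = client.getLatestSensorData(source.getName());  // latest reading for that source
    System.out.println(source.getName() + " @ " + data.getTimestamp());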
Tuesday, November 1, 2011
Hawaii's Unique Energy Situation
Hawai'i is the most isolated population center in the world. You could go to outer space and back eight times over in the distance it takes to reach the nearest major landmass. It should come as no surprise that it takes a considerable amount of effort to get things here. Whether by plane or by ship, the cost is reflected in our cost of living. Since we as an island have limited resources in coal and oil, we have to import the majority of our energy from elsewhere, resulting in energy costs double or triple those of the Mainland. As such, we have a growing incentive to convert to a more energy-efficient lifestyle.
As oil prices rise, this initiative becomes more and more feasible. So much so that former Governor of Hawaii Linda Lingle signed a bill aiming for 70% of our energy needs to be met by clean energy by 2030. This entails becoming 30% more energy-efficient while converting 40% of our current energy needs to renewable sources. While this may seem like a lofty goal, our unique location and resources, as well as the ever-advancing state of technology, make it seem like a realistic one.
Being in the middle of the ocean, we don't have much in the way of traditional carbon-based fuel. We do, however, have an abundance of the renewable resources that have been popularized over the past decade: solar, wind, wave, geothermal, you name it, we have it. As the price of investing in these technologies goes down, along with the rise of oil prices, we have a noticeable and realistic incentive to convert to renewable energy. As a state, we have already started to invest in wind farms, solar energy and research into alternatives like kukui nut oil, putting us well on the way to renewable energy.
Being a new investment, our renewable energy can only cover so much of our current energy needs. Our power plants generate far more electricity than our renewable capacity can currently replace. We therefore have to learn how to cut our energy needs to make better use of our renewable energy, which would come in the form of energy auditing and regulation. The majority of energy use in Hawai'i comes from the business and industrial sector. With current technology, a business can easily tell just how much electricity is being used and where. A feasible plan for cutting our energy consumption would be to offer incentives, be it fines for excessive energy use or tax cuts for sufficient reduction, to spur businesses to undergo energy audits and retrofits. Whether by installing new technology or altering their energy policies, a business would then have a clear reason to use less energy.
These strategies would almost definitely cut our dependence on foreign oil and act as a blueprint for the rest of the world in transitioning to alternative energy.
Monday, October 24, 2011
Midterm
The following questions, I think, would be good for a midterm:
1) What are the pros/cons of Code Review and Automated Code Assurance? Why do we need both?
2) What event would you need to override in order for your Robocode Robot to react appropriately to hitting a wall?
3) What is the main principle of The Principle of Least Astonishment (found in The Elements of Java Style)?
4) What feature was added to Java to make dealing with primitive data types easier?
5) What JUnit method would you use to test if an object is equal to its expected outcome?
Sounds pretty good, if I do say so myself.
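For what it's worth, the kind of answer I have in mind for question 5 is just:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class ExampleTest {
      @Test
      public void testExpectedValue() {
        // assertEquals(expected, actual) is the JUnit method in question.
        assertEquals(4, 2 + 2);
      }
    }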
Thursday, October 20, 2011
Version Management
The only thing that's constant is change. That holds for life, and for programming as well. The more complex programs get, the more they need to be maintained. Even if you debug your code and release a perfect product, it's likely that you will want to make a change sometime in the future: a new feature or an overlooked security issue, for instance. Version control is akin to an assistant that keeps track of every change you make to your code, ensuring each change is logged. Acting like save points in a video game, version control makes it obvious what changed and allows you to revert your code to a previous version if needed.
As a requirement for one of my classes, we had to use Subversion along with Google Code to host our Robocode project (the result can be found here). Having experimented with Git while working with Heroku, I found that Subversion came intuitively to me. Both have command-line and GUI interfaces, though on Linux the command line is really the best way to work with your files. With Subversion, you make changes to your local directory and Subversion records what needs to change on the remote server. It's as easy as 'svn commit' when it comes to updating your source. Uploading your code to a remote server is easy too, as 'svn import URL' lets you import your base directory to the given path, assuming the path is configured for Subversion management.
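The day-to-day cycle boils down to a handful of commands (the project URL and file names below are placeholders):

    svn import . https://your-project.googlecode.com/svn/trunk -m "Initial import"
    svn checkout https://your-project.googlecode.com/svn/trunk my-project
    cd my-project
    # ...edit files...
    svn add NewRobot.java
    svn commit -m "Add wall-avoidance behavior"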
The only thing I miss from Git is an easy-to-find ignore list like .gitignore, but for the purposes of uploading a Robocode robot to Google Project Hosting, interfacing directly with the folder proved just fine (though it resulted in way more revisions than I had planned). Other than that, managing versions of your software is as easy as import, edit, commit.
As a requirement for one of my classes, we had to use Subversion along with Google Code to host our robocode project (the result can be found here.) Having experimented with Git while working with Heroku, Subversion came intuitively to me. Both have command-line and gui interfaces, though with linux the command-line is really the best way to go about your files. With Subversion, you can make changes to your local directory that subversion will note to change on your remote server. It's easy as 'svn commit' when it comes to updating your source. To upload your code to a remote server is easy, too, as 'svn import URL' allows you to import your base directory to the path, assuming that the path is configured for subversion management.
The only thing I miss from git is an easy-to-find ignore list like .gitignore, but for the purposes of uploading a robocode robot to google project, interfacing directly with the folder proved just fine (though it resulted in way more versions than I had planned.) Other than that, managing versions of your software is easy as import, edit, commit.
Tuesday, October 11, 2011
Robocode: Lessthan20charsyeah
Due to time constraints I shall skip right to the nitty-gritty:
Design
I wanted my robot to keep its distance while trying to track its opponent. Upon being hit with a bullet, it tries to avoid the next one by alternating movements. What resulted was something like a cross between Walls, Corners and MyFirstJuniorRobot. It scales the walls until it's at a certain distance from the enemy, then it scans and tries to shoot at the enemy. If it gets hit by a robot, it turns inward and travels far, and if it gets hit by a bullet, it tries to dodge by zig-zagging or going further up the wall, depending on which stage it's at.
Robots that Lessthan20charsyeah can regularly beat (in a test of 5 rounds):
- Corners (5/5)
- Crazy (5/5)
- Fire (5/5)
- MyFirstJuniorRobot(3/5)
- MyFirstRobot(4/5)
- RamFire(4/5)
- SpinBot(4/5)
Testing:
Acceptance Testing
My goal for the robot was to keep away from close robots and to dodge bullets. As such, I picked two robots to do acceptance testing against: Corners, since it picks a corner and tracks a robot, and RamFire, since it tries to track and get as close to the robot as possible. Lessthan20charsyeah has an acceptance rate of greater than 80% (more often than not it is 90%), which, in my view, is sufficient for a competitive robot.
Behavior Testing
Due to time constraints I was only able to test the robot's ability to keep its distance. I calculated the distance between my robot and RamFire over a typical battle and tested whether my robot kept a distance of more than 200 for more than 70% of the time; it did. As such, it does an acceptable job of keeping its distance.
What I learned:
I learned the importance of testing and documenting. More specifically, no matter how imperfect a product is, it pays to test early and test often. Testing takes a lot of time, so one shouldn't fine-tune a product and leave no time for testing; the documentation ends up incomplete, giving the client less than perfect confidence in the correctness of the product.
Thursday, September 29, 2011
Ant: A common build system for Java deployment.
"You say: toe-MAY-toe..."
There often comes a time when language barriers are a difficulty in real life; helping travelers find their way to the nearest mall is one example. With the ever more fragmented state of computing, language barriers can cause problems too. There are three main OSes, each with its own strange idiosyncrasies: Linux, Mac OS X and Microsoft Windows. Among these systems are differences in file structure, newline symbols and a plethora of other things that make deployment a hassle. A build system lets a programmer reach the most people by automating the process and making code deployment a breeze.
Introducing Ant
Like its GNU counterpart, make, Ant provides a framework for building a given system on whatever machine it lands on. Ant acts like a translator between the system you provide and the user's computer: no matter what OS they're using, Ant can configure your system to work with theirs. Using XML, all one needs to do is define simple targets for Ant to execute. Built on Java, Ant can do anything from compiling a system (the <javac> task) to archiving it into a distributable jar (the <jar> task).
All of it through XML. All the user has to do is invoke Ant on the build file and they're done. By automating the process, the user of the code doesn't have to go through the headache of figuring out what to download and configure in order to use a great piece of code, and the group that provided the code can rest assured that it will build the same way everywhere.
Tuesday, September 20, 2011
Robocode: Productive Entertainment
Human beings are great with tools; it's a proven fact. Otherwise we wouldn't have built great empires and technologies during our short time here on Earth. With advancements in technology, many people were no longer relegated to long, arduous hours of physical labor. Thus we proved ourselves good at using tools not only to accomplish tasks, but also in completely unintended ways for our personal entertainment. Take, for example, sword swallowing. Swords and knives are great for everything from chopping down brush to carving up pigs to stabbing your enemies, but just how much downtime do you need to be so bored as to think, "hmm, I bet I could swallow that razor-sharp sword no problem"? Bets, dares and challenges gave way to everything that's popular now. We use racquets and bats to hit things not to build something, but to see how hard we can hit them. We use gloves not to handle things with care but to catch things to beat the other team.
Fast forward to the modern era and we have not only physical tools but digital ones. Our entire economy is now (for better or for worse) built on the backbone of the Internet. The world is ever-shrinking and computers assist in computations in everything from business to protein folding. In the same spirit, we also have games that turn productivity skills into entertainment, and battles between algorithms are among the most popular. In the real world an algorithm can mean that a company pulls in more money or finds a solution first, but in terms of entertainment it makes for more explosions that result in points. Here is where we introduce Robocode, a game programmed in Java in which we program simulated robots to do our bidding, namely kill the other robots on the field.
Robocode: Robots in Plain View
Robocode was first produced by IBM, but support for the game was later dropped. The code, however, was made open source and is maintained by hobbyists who, I would guess, like the sword swallowers, have a lot of downtime on their hands. When that downtime and effort results in a game as entertaining as Robocode, I don't see a problem with that. Robocode is an extremely easy way of building simulated battle robots that you can then put into a ring together for a battle to the death. All you need to do is download the installer freely available from SourceForge, run the jar install file, and use the provided programs to compile and run your robots. The provided Robot class gives you all the functionality you need to make a competitive robot, namely the three basic functions: moving around the battlefield, scanning for other robots and firing the gun.
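To give a feel for how little it takes, a robot with all three basics looks roughly like this (in the same spirit as the sample robots that ship with Robocode):

    import robocode.Robot;
    import robocode.ScannedRobotEvent;

    /** A bare-bones robot: move, scan, fire. */
    public class SimpleBot extends Robot {

      @Override
      public void run() {
        while (true) {
          ahead(100);          // move around the battlefield
          turnGunRight(360);   // sweep the gun to scan for enemies
          back(100);
        }
      }

      @Override
      public void onScannedRobot(ScannedRobotEvent event) {
        fire(1);               // fire when the scan finds someone
      }
    }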
Professor Philip Johnson proposes that in order for one to become proficient at the sport of Robocode, as one often does in any other sport, one should submit oneself to learn and practice the basic functions of Robocode in order to increase your proficiency at both programming and the understanding of the game. His form of programming katas incorporated learning how to move to certain parts of the field and how to target/follow certain enemies. I found that learning to do these gave me ideas that I wouldn't have had if I started to kill robots on my own.
For instance, the idea of keeping distance from a robot in one kata proved difficult, since I kept running into walls as I backed up, essentially causing my robot to be trapped by the enemy robot, ready to be blown up. My idea for keeping distance, however (reversing perpendicularly off the wall), looked like an effective way to do it: even if for a brief time you're moving toward the enemy robot, you'll probably be better off in the long run, and your gun is still facing him.
Another challenge was figuring out, by myself, how to calculate the angles for my robot to travel, not just in x then y, but diagonally. What I believed would make me a better programmer, and one of the basic purposes of the katas, was to challenge myself and solve the problem on my own, without open-source samples. This shows in my code, since I have no doubt it's anything but optimal. However, I successfully learned to travel directly to the center using Java's atan function with the difference in coordinates between the robot and the point I wanted to reach. Having come to this after much trial and error, I believe I have a better understanding of the coordinate system than if I had simply copied another person's code.
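The heart of it looked something like the following sketch. I've written it here with atan2, a close cousin of atan that handles the quadrant bookkeeping; Robocode headings are in degrees, clockwise from north, which is why the x-difference goes first:

    // Turn toward an arbitrary point (cx, cy) and drive there.
    double dx = cx - getX();
    double dy = cy - getY();
    double angleToTarget = Math.toDegrees(Math.atan2(dx, dy));
    double turn = angleToTarget - getHeading();
    // Normalize to [-180, 180] so the robot takes the short way around.
    while (turn > 180) { turn -= 360; }
    while (turn < -180) { turn += 360; }
    turnRight(turn);
    ahead(Math.hypot(dx, dy));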
The downside to figuring things out for yourself, however, is the time it takes. It took me a while to learn how to travel along a wall, use the radar system and use the various events to my advantage, so much so that I didn't have time to formulate a working plan to track a moving target (I believe I have fully finished the rest). The good thing about programming katas, however, is that if you practice repeatedly you will eventually have enough experience to solve problems in a quick and efficient way. In that spirit, I intend to learn from this assignment and keep tackling things with minimal assistance, since it makes me a better programmer to understand things conceptually instead of just on the surface.
Tuesday, August 30, 2011
FizzBuzz, A Return To Java
The last time I really coded in Java without simultaneously learning another language was ICS 111. Since then I had all but renounced Java, as the dynamic languages I learned in classes like 215 and 313 were a lot more appealing to me. Diving back into Java on the first day of 314 was a challenge (though it was akin to riding a bike: you never really forget).
Being asked to go back to Eclipse was another blast from the past, after switching over to Emacs in the past year. The first problem was that in Emacs I use viper-mode, a plugin that emulates the keybindings of vi. To get writing the code, I had to dig into Eclipse's settings and switch to Emacs keybindings, which were a little more familiar to me. Factoring this into the equation, I produced the following code in about 10 minutes:
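(The original snippet was an embedded gist that is no longer visible here; reconstructed from memory, it was essentially the classic FizzBuzz, in the brace-less style described below.)

    public class FizzBuzz {
      public static void main(String[] args) {
        for (int i = 1; i <= 100; i++) {
          if (i % 15 == 0)
            System.out.println("FizzBuzz");
          else if (i % 3 == 0)
            System.out.println("Fizz");
          else if (i % 5 == 0)
            System.out.println("Buzz");
          else
            System.out.println(i);
        }
      }
    }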
I decided to omit the curly brackets from the code because of code I read in 211 (and in Java in a Nutshell). Having learned Python in 215, I liked how compact it looked, so I decided to make my if statements similarly compact. I also have a bad habit of not documenting my code, as seen here. Having not really produced code for anyone but myself and the TA grading it, I often try to make code as compact as possible without really explaining what I'm doing. While I could look back at code I wrote, I wouldn't be surprised if a third party couldn't understand it. Even if this is a trivial program, a few lines show one's proficiency and understanding in writing good code.
Sunday, August 28, 2011
JFreeChart vs. JChart2D, a tale of two philosophies in Open Source Software
One of the greatest things about software development in the modern world is the amount of free, open-source software available for anyone with the know-how and determination to use it. With the popularization of cloud storage, easy version control systems like SVN, CVS and Git, and sites like SourceForge and GitHub, one can easily host projects that are easily accessible and maintained. With the popularization of open software, though, comes an over-saturation of options to choose from. It's with guidelines like Philip Johnson's Three Prime Directives that one can judge whether a project is usable and truly open.
Here I will compare the simple JChart2D with the much more complex JFreeChart to demonstrate what it takes to produce a truly effective open-source program.
[Two chart screenshots, captioned "Could this ..." and "... be better than this?"]
Prime Directive 1: The system successfully accomplishes a useful task.
The reason I chose graphing programs was my experience trying to find a suitable graphing library for ICS 311. Along with deploying an applet, we were required to graph the ratios from Maximum Flow graphs. We had the option to use third-party software, but due to time constraints I was unable to find one that suited my needs. Either JFreeChart or JChart2D would have suited me fine for plotting the calculated data, and so would have allowed me to pass the class.
Prime Directive 2: An external user can successfully install and use the system.
Here is where the philosophies of the two systems diverge. From the perspective of a cash-strapped college student, one can only successfully use a system if it is either intuitive or accompanied by free documentation that quickly gets you to your goal. JFreeChart, despite the name, provides insufficient documentation through Sourceforge and instead charges $65.00 per client for its Developer Guide. With the system being so popular (4,747 downloads in the week of August 21st), I thought I would be able to use a powerful library that would make my code presentable and sophisticated. Instead, I was left dredging through undocumented source files and demos whose source code I could only purchase.
JChart2D, though a less polished and less popular charting system, has an intuitive, understandable usage page that explains the infrastructure involved and provides code samples for plotting and drawing a simple graph. The time it took from downloading the provided jars to building my own graphs was significantly less than it would have taken to understand and use JFreeChart without documentation. While JFreeChart provides source code and allows free usage, provided that you can decipher its infrastructure, JChart2D embodies the spirit of truly free open-source software in that it doesn't require its users to pay in order to use it effectively.
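For a sense of what that quick start looks like, here is a sketch from memory of the kind of example the usage page walks through; I'm recalling the class names (Chart2D, ITrace2D, Trace2DSimple) from the jchart2d jar rather than quoting the page, so treat it as an approximation:

import info.monitorenter.gui.chart.Chart2D;
import info.monitorenter.gui.chart.ITrace2D;
import info.monitorenter.gui.chart.traces.Trace2DSimple;
import javax.swing.JFrame;

public class JChart2DSketch {
  public static void main(String[] args) {
    // A Chart2D is a Swing component; traces hold the data it draws.
    Chart2D chart = new Chart2D();
    ITrace2D trace = new Trace2DSimple();
    chart.addTrace(trace);

    // Plot a simple curve, e.g. y = x * x.
    for (int x = 0; x <= 10; x++) {
      trace.addPoint(x, x * x);
    }

    // Drop the chart into a frame and show it.
    JFrame frame = new JFrame("JChart2D sketch");
    frame.getContentPane().add(chart);
    frame.setSize(400, 300);
    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
    frame.setVisible(true);
  }
}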
Prime Directive 3: An external developer can successfully understand and enhance the system.
Both systems' source code is readily available through Sourceforge: JFreeChart directly from its Download page, and JChart2D through CVS. Both systems are well documented at the method level, using Javadoc to describe parameters and return values for each method. The problem with JFreeChart, however, is that its overall design is never clearly described (at least, not without purchasing the Developer Guide), so one can't figure out what to change. JFreeChart's source is fragmented, with one section for its charting functions and another for its data management, and no clear explanation of its infrastructure (the documentation at the root of the data packages, for example, reads "The base package for classes that represent various types of data.", with no instructions on how to create a dataset for use with a JFreeChart object).
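For contrast, here is roughly what I eventually pieced together for JFreeChart from its demos; again a sketch from memory (XYSeries, XYSeriesCollection and ChartFactory.createXYLineChart are the names I remember, not something spelled out in the free documentation):

import javax.swing.JFrame;
import org.jfree.chart.ChartFactory;
import org.jfree.chart.ChartPanel;
import org.jfree.chart.JFreeChart;
import org.jfree.chart.plot.PlotOrientation;
import org.jfree.data.xy.XYSeries;
import org.jfree.data.xy.XYSeriesCollection;

public class JFreeChartSketch {
  public static void main(String[] args) {
    // Build the dataset: one series of (x, y) points.
    XYSeries series = new XYSeries("y = x * x");
    for (int x = 0; x <= 10; x++) {
      series.add(x, x * x);
    }
    XYSeriesCollection dataset = new XYSeriesCollection(series);

    // The factory assembles a complete chart from the dataset.
    JFreeChart chart = ChartFactory.createXYLineChart(
        "JFreeChart sketch", "x", "y", dataset,
        PlotOrientation.VERTICAL, true, false, false);

    // Wrap the chart in a panel and show it in a frame.
    JFrame frame = new JFrame("JFreeChart sketch");
    frame.setContentPane(new ChartPanel(chart));
    frame.setSize(400, 300);
    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
    frame.setVisible(true);
  }
}

It isn't much longer than the JChart2D version once you know where to look; the problem was finding out, without the paid guide, which of the many data classes to start from.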
JChart2D, on the other hand, has the advantage of freely available documentation. One can easily understand the main components of JChart2D thanks to a simple pictorial representation of its architecture. Granted, its architecture isn't nearly as complex as JFreeChart's, owing to less functionality, but complexity could easily be addressed by good documentation, something JFreeChart does not provide by default.
In summary, a comparison with the more popular yet less intuitive JFreeChart shows that JChart2D embodies the benefits of truly free open-source software: it is easily obtainable through Sourceforge, provides straightforward explanations of its usage and architecture, and invites people to both use it and modify it for free.
Friday, August 26, 2011
...Aaand we're back.
To say that this week has been rough on the state of my computer would be an understatement. To start off the semester, I made one giant mistake that is often disastrous in both computer science and life itself: a last-minute change.
A week before, I had upgraded my laptop's hard drive to a solid-state hybrid drive, the Seagate Momentus XT, which combines a small amount of flash memory with a 7200 rpm hard disk. To my dismay, I found that my existing Windows 7 installation would not work with the new hardware, even after cloning and restoring it with Clonezilla. Not having an optical disk drive (the trade-off for portability), I had a hard time creating a bootable USB drive from the recovery disk that came with the laptop, only to find out that the recovery disk also required an install CD. At that point I gave up on recovering my old configuration and started looking for alternatives.
Having found an official Windows 7 SP1 ISO online that I could boot with the help of Plop Boot Manager, I made a fresh install on the new drive, only to find out that the Product Key that came with my laptop (and, after a year spent on the underside of the machine, was nearly illegible) did not work with an SP1 install. One wonders how good a Product Key is if it doesn't work with the updates it's supposed to cover. Rather than put myself through more trouble, and since UNetbootin makes it easy to build a bootable installer on a flash drive, I decided to try a Linux partition in place of my Windows 7 install. Instead of the familiar Ubuntu, I strayed from what I knew and installed Linux Mint LXDE, a distro that ships the lightweight LXDE desktop along with proprietary software like Java and Flash by default. While I did like the window manager, I was unfamiliar with the distro's navigation; I could not bring up the file system except from the menu bar, among other things. The system also did not work well with my NVIDIA ION GPU, producing odd artifacts whenever I launched the terminal, for example, and it crashed a fair number of times, often while the GPU was taxed by video playback. So I decided to install another Linux distro I'd heard was good: Debian.
As opposed to Ubuntu, which is sponsored by Canonical, Debian is tied directly to the GNU open-source project and has no central corporate backer; it is developed entirely by volunteers, which makes it a great choice for people who want to support truly free software. Unfortunately, Debian did not play well with my Linux experience or with my install. Whereas Ubuntu offers live CDs with a simple installer, Debian uses the older approach of a series of menu-driven installation options. Wanting to keep room to install Windows 7 later, I tried my hand at manually configuring the partitions (one ext3 partition for Debian, one swap partition, and one NTFS partition for Windows 7). When the installation finished, on the morning Assignment 01 for ICS 314 was due, I found that the LILO bootloader would not boot into the Debian install and instead dropped me into a BusyBox shell. No amount of Googling and tinkering made the install bootable, so I missed Assignment 01. As soon as I got home, however, I decided to return to the tried and true Ubuntu install.
As of this post, I am running Xubuntu 11.04, an Ubuntu derivative that uses the Xfce desktop rather than Unity, Ubuntu's new window manager, which has received mixed reviews. It's an upgrade from 10.04, which I had installed alongside my Windows 7 install on the previous hard drive, and it seems just as good as I remembered, if not better.
I had already learned about the perils of last-minute changes from my work as an A/V Operator at the East-West Center. I should have given myself more time to get up to speed with the new hardware and software, but I couldn't get my configuration where I wanted it in time for Assignment 01. That leaves me with a wake-up call and an incentive to work extra hard to catch up.