My first Google App Engine application


Yesterday I created my first application using Google App Engine.

I've been thinking about the possibilities of using a cloud computing environment for quite some time now. Google has the advantage that you can use their services for free up to a given threshold of resources.

When I first heard about Google App Engine, it was some years ago and it only supported Python. At the time, Python didn't exactly rock my world so I just walked away and kept on doing other things.

A few months ago, Java support was announced. My wild guess is that Python was not exactly a popular choice amongst many developers, but Java is a whole different case.

So, I went on with my trial of their (free) service.

The Java environment officially supported by Google is based on the Eclipse IDE. Since my coding preferences fall more on the NetBeans side of life, I "googled" for an alternative.

It turns out to be quite a simple configuration process.

The first step is downloading and unpacking the Google App Engine SDK to some folder on your disk - http://code.google.com/intl/en/appengine/downloads.html

No fancy install involved, just unzip and place it somewhere nice.

-------------------

Then, we open NetBeans. Google support is added as a plugin for the NetBeans IDE. There is a very good page that details these steps: http://kenai.com/projects/nbappengine/pages/NBInstall

When configuring the Google plugin, I didn't select the Hibernate support and it didn't seem to matter much either way.

After all is said and done, you only need to add Google App Engine as a server inside your environment, and this is also a simple task.

Click on the "Services" tab (that is next to "Projects" and "Files" on the right side of the IDE) and right click on the "Servers" item.

All that is left is to add a new server and select "Google App Engine".

---------------

You should now have everything set up.

To start a web project, create a new project and select "Java Web" --> "Web Application". It will create a template project for you that says "Hello world!"
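
Just for illustration, a minimal "Hello world" servlet in plain Java could look like the snippet below. Note this is only a sketch to show the general shape of such a project - the class name is made up and the actual template generated by NetBeans may use a JSP page or a different structure.

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical example - the generated template may differ from this
public class HelloWorldServlet extends HttpServlet {
    public void doGet(HttpServletRequest request, HttpServletResponse response)
            throws IOException {
        response.setContentType("text/plain");
        PrintWriter out = response.getWriter();
        out.println("Hello world!");
    }
}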

When you're ready to publish the project on the cloud computing environment, just right click on the project and choose "Deploy to google app engine".


Don't forget that you need to have a project already configured on the Google App Engine website. After this you should be able to visit your URL and try out the results.

You can also view a full video of all these steps at the following location: http://kenai.com/projects/nbappengine/pages/Home

---------

Hope you have fun with cloud computing; this is a really simple and interesting concept at a very affordable cost.

:)

Implementing a customized Hot folder tracking with Java 1.5

One of the features required for the new project is to track changes made on specific folders.

This is what's commonly called a "hot folder". In newer Java versions this feature already comes implemented using the OS API whenever available.

In my case, being stuck with Java 1.5, there is no built-in option other than polling a folder and checking for changes.

However, just checking one folder and not going deeper is still rather limited, so we need to stretch this concept further and use some imagination to check the sub-folders as well, without draining all of the memory available to the Java application or hogging the CPU.
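
Just to illustrate the idea, here is a minimal sketch of what such a polling approach could look like with plain Java 1.5. This is not my actual implementation (which keeps its state in a database rather than in memory, as described below) and the class name is just made up for the example:

import java.io.File;
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of a polling "hot folder": walks the folder tree at fixed
// intervals and reports files whose last-modified timestamp changed since
// the previous pass.
public class HotFolderPoller {

    private final Map<String, Long> lastSeen = new HashMap<String, Long>();

    public void scan(File folder) {
        File[] entries = folder.listFiles();
        if (entries == null)
            return;
        for (File entry : entries) {
            if (entry.isDirectory()) {
                scan(entry); // go deeper into the sub-folders
            } else {
                Long previous = lastSeen.get(entry.getAbsolutePath());
                long modified = entry.lastModified();
                if (previous == null || previous.longValue() != modified) {
                    lastSeen.put(entry.getAbsolutePath(), modified);
                    System.out.println("Changed: " + entry.getAbsolutePath());
                }
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        HotFolderPoller poller = new HotFolderPoller();
        while (true) {
            poller.scan(new File(args[0]));
            Thread.sleep(5000); // pause between passes to spare the CPU
        }
    }
}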

This was a nice challenge. I used some of what I learned in the Models of Software Systems class with Pedro Bizarro; I guess he'd be proud to see a finite state machine diagram being used to describe this concept. It was indeed a good tool to work out the details before moving on to the implementation.

I include a small screenshot of the sketch; some changes were made in the final implementation. It's working really fast, indexing all sub-folders at blazing speed and resorting to a database to spare the precious RAM.




Let's move onto the next challenge! :)

Surviving to java.lang.OutOfMemoryError: Java heap space

Memory leaks happen.

You might be a neat and tidy person who tries to ensure no waste is made from the start of your project's coding, but leaks happen.

I fell into this trap.

I was feeling nice and comfortable, believing that all my code was behaving nicely and that using a database was keeping memory consumption low.

Everything was working perfectly when working with a few hundred files.

However, when dealing with a little over 50 thousand files the picture turned grey and the evil java.lang.OutOfMemoryError: Java heap space appeared.

I tried my best to pinpoint the evildoers but couldn't really point my finger at any guilty party, since none seemed to be to blame.

Looking for solutions, most people recommend increasing the memory available to the virtual machine to handle the stress. But as mentioned on the opcode website: http://blogs.opcodesolutions.com/roller/java/entry/solve_java_lang_outofmemoryerror_java, adding more memory only hides the problem; it doesn't really bring a scalable solution.

That blog post in particular turned out to be very useful. A better approach is using profiling tools that "police" the application, showing how it behaves and which parts are not behaving nicely.

NetBeans is remarkable in that sense.

It already comes with a profiling tool that is built-in and quite simple to use.

Just click on "Profile" --> "Main Project" and follow the directions.

It will launch your project and allow you to track what is really going on underneath the hood while the program is running.

Far better than following your program from the external task manager.

I'm attaching a screenshot so that you can see what it looks like:


My application was bursting at 100 MB; with profiling I was able to track down the reason for the leak: the log entries were consuming too many resources.

After disabling the log entries, you can see on the screenshot how it goes well past the previous breaking point, using less than 10 MB of memory to index 100,000 files.
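
Just to illustrate the kind of change involved - my application's logging setup may well differ, and the logger name below is made up for the example - with java.util.logging the idea boils down to raising the log level and guarding the expensive calls so the message strings are never even built:

import java.util.logging.Level;
import java.util.logging.Logger;

// Illustrative only: raise the log level so fine-grained entries are discarded,
// and guard each call so the message string is not built when it would be dropped.
public class IndexerLogging {

    private static final Logger log = Logger.getLogger("indexer");

    public static void main(String[] args) {
        log.setLevel(Level.WARNING); // drop the INFO/FINE entries that pile up

        for (int i = 0; i < 100000; i++) {
            if (log.isLoggable(Level.FINE)) {
                log.fine("Indexing file number " + i);
            }
        }
    }
}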

There are still a lot of things to improve in my code, but this nifty tool sure helps make it possible.

:)



Java: Getting the localized path to the user's Desktop under Windows


Getting the path to the user's desktop under Windows is tricky if you're not using Java 1.6 or above.

Looking for a solution around the web, I see many people recommending a wrapper around the Win32 API using SHGetSpecialFolderPath, or reading the value directly from the registry, as suggested here: http://stackoverflow.com/questions/1080634/how-to-get-the-desktop-path-in-java

But I'm not very fond of these solutions and found one that is simple and seems to work fairly well, although it doesn't seem to be documented (yet).

Just try this snippet under a Windows machine:

import java.io.File;
import javax.swing.filechooser.FileSystemView;

FileSystemView filesys = FileSystemView.getFileSystemView();
// On Windows, getHomeDirectory() resolves to the user's (localized) Desktop folder
File desktop = filesys.getHomeDirectory();


And you should be able to get the desktop folder as expected. In my language, this folder is called "Ambiente de trabalho" and it worked like a charm.

This tip comes from Russ Bradberry: http://stackoverflow.com/questions/570401/in-java-under-windows-how-do-i-find-a-redirected-desktop-folder

:)

Reading the file version from Windows EXE and DLL files.



One of my goals for this week was reading the version of Windows executable files. The code should be implemented in Java and avoid resorting to any native calls.

So, a solution that seemed simple and straightforward would be to read the PE header of these files directly and extract the version number for our use.

As many things in life, it's easier said than done.

The trouble started with the code required to handle binary files. I'm using Java and had none of my trusty code from previous projects to read files using the specific x86 sizes for DWORD, WORD and length-limited Unicode strings.

To my rescue, I've found the nifty binary file class from Jeff Heaton: http://www.heatonresearch.com/articles/22/page2.html

It's simple and perfect. Far better than the code I used in my previous projects. To handle Unicode strings I read each byte of the string into a byte buffer and then use a Java call to convert it properly:

Grabbing the byte sequence of the string:

rgbData = new byte[Data];
for (int i = 0; i < rgbData.length; i++) {
    rgbData[i] = (byte) bin.readByte();
    // stop when two consecutive zero bytes mark the end of the string
    if (i > 0 && rgbData[i] == 0 && rgbData[i - 1] == 0)
        break;
}


Converting to plain string:
String output = new String(rgbData, "UTF-8");

--------------------------

But my biggest trouble was the fact that the file version was not kept inside the PE header itself. The file version for DLL, EXE, OCX, DRV, SCR and similar files is kept inside a resource on the file. (Thanks to TheK on boot-land for helping me sort this detail: http://www.boot-land.net/forums/index.php?showtopic=11890)

So, besides implementing the PE header reading part, it was also necessary to implement all the logic to correctly interpret resources inside executables.

Luckily for me, this format is extensively documented around the Internet and even MS itself has released official documentation that explains (to some extent) how the structures should be read.

Nevertheless, it took me far longer than initially expected. I had planned for a full day of work and ended up working 3 days to achieve this goal. The code itself is not optimized for speed, but for the moment it will suffice for the needs of the prototype.

I've tested it both with DLLs from the Windows kernel and with custom executables from other compilers, including ones that had been modified with UPX - the EXE compressor.

It was a good adventure. I've learned far more than what I originally knew about the binary format of Windows executable files, and this added knowledge might well open the "window" for other adventures in the future.

:)

Making Java apps look good in Mac OSX

Mac OS comes bundled with Java already installed by default and strives to make the user interface easy for users.

However, for those who like coding Java applications it might seem like a daunting task to make them look user friendly. If we were coding a platform-specific application, we'd just select an icon to include at runtime and be done with it.

But in Java it's just too complicated, and most pages about this topic that I stumbled upon didn't really shed much light on the matter.

Eventually, I discovered that Mac OS by default also includes some nifty tools to solve this matter.

There is a life-saver application called "Jar Bundler" and you can find it inside the /Developer/Applications/Utilities folder within the Mac OS. I've found it on this page: http://developer.apple.com/mac/library/documentation/Java/Conceptual/Jar_Bundler/Packaging/Packaging.html

--------------

Using this tool is a snap: open it up, select the JAR file that you intend to run and pick an icon.

Before clicking on "Create application", you will need to choose a folder that will serve as the base for your application. My advice is to pick an empty folder on the desktop.

-------------

Other tweaks:
You can click on the "Set working directory to inside application package" to ensure that any files that your jar creates are kept inside the package (to keep it nice and tidy).

The icon format used by this tool is .icns but don't worry, you can use the online service at http://iconverticons.com/ to convert the file without any pain. If you want to make your image transparent, just open it in GIMP, click on "Colors" --> "Colors to Alpha" and then "OK". If you save the image as .png it will be lossless and preserve transparency.

That's all. Hope this tip helps others create good-looking Java apps for OS X.

:)

http://msubuntu.com


Over the last few years I've been a sort of doomsday prophet, claiming that MS will acquire Ubuntu sooner or later.

The reason I make such a claim is mostly that Ubuntu is really good at what they do and the desktop/server editions just get better and better every 6 months.

Many people think that Microsoft is the all-time enemy of Linux, but in reality Microsoft was once the biggest provider of Unix operating systems, even before Linux was born.

This was at the time of Xenix, a licensed Unix version that was leased to other companies for deployment in organizations. In many ways, MS contributed to making Unix better and later went on to write its own history with MS-DOS and Windows (in all flavors).


Now, Ubuntu seems to pick up on the same characteristics that made Windows a platform that everyone could use for their daily work along with any other enterprise tasks.

They're quite different from the other Linux flavors in the sense that the focus is on making the desktop pleasant for users, instead of making users adapt to the desktop.

I can't forget the endless times I needed to edit a pesky xorg.conf to try to get my display to work correctly. In Ubuntu you see no such thing, and that can only be a good thing for those who worked with older versions of Linux.

Microsoft cannot compete against an operating system that is provided free of charge to its users, but it can certainly acquire the company and profit from the leading position on the Linux platforms that Canonical holds at the moment.

The moment an MS Ubuntu version appeared, I'd honestly expect to see many organizations adopting Linux as their default server configuration just because of the MS label posted on the box.

Inside a big organization, MS knows how to provide outstanding tools but let's face it: many times we need a simple server and paying a costly license is not a good motivator to use MS software.

Maybe this would even become a good way to refresh MS's image from a monolithic empire that is losing ground to rivals such as Apple and Google into an open company that is embracing the future and human innovation.

--------------

So, I've gone one step further with my prophecy and acquired the domain http://msubuntu.com to post my thoughts and hopes for this future to one day occur.

Perhaps more people out there also think this might become a real scenario some day and will join me there; it would be nice to see this happen.

Crossing my fingers..

:)

yEd - a hidden gem for those who need a simple and free diagram editor

I've been a long time fan of Dia.

Both the Windows and Ubuntu versions worked well enough to cover my diagram needs.

I just open it up and place all the diagrams into position, to later paste them into any document. Some people like Visio better, and I'm sure it is better, but my interest was in using a freeware tool that didn't require a license for something I deemed very simple.

Well.. there is no Dia for Mac. There is an expensive replacement for Visio on OSX but I'm still interested in good and free solutions.

Looking around the web I've stumbled on a very neat application: yEd.

This graph editor is simply perfect for my needs. It runs in Java and I could try it out right away without installing it on my machine. The design is very intuitive and, though there are some things that might take an hour or two to get used to, it is very easy to use without the need to read any sort of manual.

That's the type of program that I like, free and simple.

This editor can handle UML and a lot of goodies that someone in software development will surely enjoy.

If you're in need of a simple diagram editor, you can find it here:
http://www.yworks.com/en/products_yed_about.html

:)

nuno vs the CVS plugin on eclipse


If you're an Eclipse user, you probably already noticed how CVS comes as a plugin already available on the default package.

This is nice and neat to set up a new install and grab your projects without further worries.

In my case, that was the nice type of experience I had when installing Eclipse for the first time on my work laptop, a Mac.

Things went smoothly; I got online to grab the documents from CVS with no worries whatsoever.

However, my nightmare started when, after a few days, I started Eclipse and CVS would simply refuse to work for no apparent reason. I know that at this point someone would say: "there is no such thing as no apparent reason" - and in all honesty I agree, but I have no clue about what changed.

This issue started last February.

I've tried to solve this annoyance in numerous ways: first by uninstalling and reinstalling Eclipse, then by trying the Cocoa version of Eclipse to see if it would work differently, and finally by resorting to the world wide web... to no avail in this case.

Using Eclipse started to become a real nuisance. I tried asking some experts for their opinions but none would come to the rescue; the most common advice: use a PC.

It just seemed that I couldn't get Eclipse back to factory defaults: darn Mac.

Well.. I'd be happy to use an operating system that would make my life simple, but I'm also too stubborn by nature to give up on my intent of solving this trouble.

From February until now I had reached a stalemate in my quest. I managed to get CVS access from Eclipse, but the downside was that I was using a fork of the Eclipse project called "EasyEclipse" (http://www.easyeclipse.org)

It was indeed "easy" to use but also seriously outdated compared to the current stable version of Eclipse. This week it became problematic to use an older Eclipse version because my team will all be using the stable version to ensure that everyone has the same environment.

I wish I could complain, but I was the guy who wrote the default environment documentation, so there was simply no way around it other than facing my issue with CVS once and for all.


And today: rejoice!

I've discovered how to solve this CVS nuisance. A lot of people know how to install plugins from within Eclipse. But what few people probably know is that we can also uninstall them just as easily.

And uninstall I did..

I'll share this little secret: Click "Help" -> "Install new software" -> "Already installed" -> select "Eclipse CVS client" -> click "Uninstall"


After this, just restart eclipse and install CVS again from the default eclipse repositories.

So simple to fix and so many headaches to find the solution...

:)



virusremoval.pro begins

Yesterday I announced the virus removal community forum to the good people of Boot Land.

The feedback was surprisingly good. The site was started a little over four weeks ago and averaged around 100 daily visitors; yesterday we peaked at over 1200 unique visitors.


This kind of response is truly motivating. The quality of the ongoing discussions is also very good; there is a lot to be learned and we do have what it takes to give malware authors a really bad headache.

:)

Email robustness


Ever since I first got my own personal domain, http://nunobrito.eu, some years ago, it became possible to use my official email address, mail@nunobrito.eu

Instead of having several mailboxes I decided to concentrate them all in my Gmail account using email redirection.

It's simple and it worked well.

However, the server where my domain is hosted would often go offline for several days in a row due to the sheer number of people trying to access the other services hosted there.

This put me in a fragile situation for the first time, because I would no longer receive emails while the server remained offline.

I've devised a way to provide email robustness at no extra cost and without requiring a server.

It's very simple.

- The first step is registering your domain with GoDaddy. If you've already registered with their service then that's fine; otherwise I'd seriously recommend moving to this particular registrar.

- Then register for Google Apps Standard Edition: http://www.google.com/apps/intl/en/group/index.html
Enter your domain as the domain that will host the apps; don't worry, this won't interfere with your site in case you already have one developed.

- After registering your domain with Google Apps, look at this page on how to configure the MX records at GoDaddy: http://www.google.com/support/a/bin/answer.py?hl=en&answer=33353

But my favorite part is this automatic tool to do it quickly: https://www.godaddy.com/gdshop/google/gmail_login.asp

The above link will automatically configure your domain to use gmail as the mail service.

- At this point email is handled by Google Apps; now let's manage your site:
https://www.google.com/a/cpanel/example.com (change example.com with your domain)

- After login, click on "Email"

- Click on "Change URL" and select the URL that is hosted by google so that you don't need to use your own domain to log onto the webmail service.

- Click "Email addresses" and then add a user.

- Select a user listed there and add "nicknames" to the user. These are the possible email addresses that will be used to route the messages.

- Log onto the webmail service and set the routing of all messages to wherever you want, just as you would with a normal Gmail account.

- Done!

You are now routing all messages sent to your official email address to wherever else you need.

Good luck.

--------------------

Now my official email is robust and no longer depends on my own server.

:)

Back to my roots


This is my final week in Pittsburgh. I've completed the second semester just like all other students from the MSE program.

Pittsburgh was really nice but now it's time to return to Portugal, where the remaining two semesters will last until December.

I should be happy now. The most difficult times should have passed and things from here forward should actually be more fun: I'll be coding and creating the studio project that we have been planning for such a long time now.

However, I'm not happy. Returning to Portugal means a direct confrontation with the army in my country. From my relatives I can only gather some whispers and rumors. Even my lawyer is mute as the stormy days approach the road ahead.

At the idea of returning to my roots I feel like a small kid coming home after doing something wrong, afraid of a reprimand from his father.

And in fact, the army is still like family to me. They helped me as a teenager when the future was not promising, providing the tools, resources and discipline necessary to survive.



Now it's time to discover the world outside my "adopted family" even if it means facing the anger or disappointment of those that expected me to continue in the same career.

But this path is not easy. Usually I don't get tired; I might get exhausted and need a good night of sleep or even a good coffee to start a fresh day, but right now I'm just getting tired of all these obstacles.

Guess I need to remind myself that we all have a purpose and I'm still defending what I believe to be correct.

I might not be happy about the future but I'm certainly happy to have reached this far in my goals for life.

Even if I could go back to the past and face the same decisions, I'd still make them. I wouldn't trade the knowledge, people and memories for anything less than a chance to truly start living with freedom of choice.

And having a choice really makes me smile.

:)

Can our legacy outlive us?


Though this is sort of a taboo matter, no software developer or webmaster likes to wonder about what will happen when they are no longer around to ensure the survival of their legacy works.

Unfortunately, I'm already forced to think about these matters due to certain attributes and choices of my life that might shorten my available time span to take care of them.



Over the years, I've built software and helped raise websites with people who now depend on my direct support to keep things going on a daily basis.

Every now and then a new situation appears that needs to be solved. Today one of the subdomains was reported as hosting a malicious page, meaning that someone from the outside managed to break our perimeter defenses and use our server for their malicious purposes.

Our server was automatically shut down by the hosting provider until I could talk with them and remove the malicious page.

I love keeping things tidy and running well, but I'm certain that when I'm no longer around to provide this type of service, few others will be available or willing to carry on the support needed to sustain the things we achieved over the years.


It would be unfair to the many who trusted us as guardians of their knowledge to fail in this task. Websites like boot-land.net and projects for winbuilder deserve to endure long after the initial authors and patrons are gone.

My own memories, hosted at nunobrito.eu, also depend on my own administration to remain available.

This is clearly not a solution that can last for centuries to come. One idea would be moving to the platform of a bigger provider like Google, Yahoo, Microsoft, Flickr or Facebook.

But none of them will likely work the way we need. Anything hosted on their platforms is outside our own management and can be lost at any given moment.

How to solve this?

Though complete automation and debugging of the issues that occur is never possible to achieve, it is possible to prepare the path and the mechanisms to allow others to follow in our steps, and also to ensure that this process is simple enough to keep things moving.

--------------------

The first worries are costs, which are divided between hosting costs and domain renewals.

To support these costs, the advertising revenue gathered from the bigger sites is currently enough to keep things self-sufficient. Revenue from advertising is deposited directly into a specific PayPal account and is drawn from two separate advertising channels to ensure redundancy.


The first step to solve the matter of domain renewals is concentrating all domains at the same registrar (I've chosen GoDaddy as they are the leading provider of this service). This is not as simple as it seems: the most important domains, such as winbuilder.net and boot-land.net, are registered at dominios.pt, who are aggressive about domain transfers. I will try to solve this.


The second step is automating the renewal process. Since PayPal is the standard web payment tool, I can automate the withdrawal of funds to keep the registrars happy.

Once the domain renewals are solved I will focus on the hosting plans. Currently we have two servers. One of them is directly sponsored by R1Soft. Unfortunately I'm unable to get in contact with them to transfer the sponsorship to the second server, which has far better conditions than the first.

At the moment we have a server that is expensive (110 USD/month) and barely used, along with the new server that costs around 140 USD/month.

The most problematic situation is the new server. The 140 USD/month cannot be paid from PayPal; this bill needs to be settled with a credit card or a direct bank transfer from an account located in Germany. The problem is that credit cards are only valid for a limited number of years. This is not a solution that will last for long, since we'll always be vulnerable to missed payments on the bank account used for this purpose.

Contacting the hosting company to ask for PayPal support has revealed itself a fruitless effort. They are afraid of PayPal and will only keep the traditional payment methods.

There's a saying in Portugal: "If Mohammed doesn't go to the mountain, then the mountain will go to Mohammed". I guess the next logical step is opening a bank account in Germany. Being a European citizen and having good friends in Germany, it shouldn't be too difficult, though I'm not yet sure if it is possible.

From that point it would be necessary to ensure that PayPal transfers enough revenue to the German account each month.

One thing is certain: without ensuring that the hosting bills are paid flawlessly, our projects will be vulnerable to extinction.

--------------------------------------

Ensuring a good flow between expenses and revenues is essential. But unfortunately this isn't something a software engineer can code as a program to be compiled and executed.

We will always need a human operator to ensure that things keep running even when something unexpected occurs. So, the challenge is handing the management of our finances to someone we trust. But how can we assess whether someone is truly trustworthy for this task?

There are too many open variables at this time to properly solve this challenge; I guess we'll need to use a credit card for the next 2~3 years and renew it each year, until a new hosting provider is found or we really do open a German account to get this settled.

----------------------------


The second point is server administration. How and where can we find people capable of managing our servers?

The technology is fairly standard, but none of the servers follows a traditional setup; they were customized for speed and performance when subjected to heavy loads or the special behaviors of winbuilder projects.

I know how to get things working; how can others learn to do the same?

It would be nice to completely outsource the winbuilder projects to run on other service providers and ease our administration tasks, but who would be willing to support terabytes of traffic per month and provide gigabytes of space on their disks?


--------------------------------


Things are not easy for us, but none of the folks at boot-land quit when faced with challenges, nor will I give up on finding a decent solution while I'm still here.



When considering all the adversities involved in ensuring that our legacy can outlive us on the Internet, most people would say that we are simply outnumbered.



And perhaps they are right. Ensuring that so many different services with such complex characteristics survive the test of time sounds simply nuts.

But having strong odds against this endeavor doesn't make it impossible to achieve. With a good strategy we might just be able to use our own resources to reach the expected results.


There is no way of supporting the cost of renting services from a cloud computing host or hiring specialized support personnel. Given all these constraints we might as well dive into building our own robust cloud and administration solution to allow our legacy to survive in the years to come.

We wouldn't just be reinventing the wheel, we'd be redefining a whole new vehicle to take us where we need to go.

Perhaps this is indeed nuts, but it is a worthy challenge, as nobody else seems (yet) to be considering this type of issue and we do have a tradition of moving ahead of the game..

:)

Are we romans?


For a long time, Europe lived under the rule of the Roman Republic and, later, the Roman Empire.

Throughout this time, roads were built and civilization was brought to places where the people previously had habits not too distant from those of the Ice Age.

Roman civilization meant education, health and wealth for those who embraced Roman culture as a way of exchanging goods and traveling safely to other provinces.

However, the Western Roman Empire, where France, Germany, Portugal, Spain and Italy exist today, was progressively torn apart over the centuries by successive invasions of barbarians from the northern parts of Europe, until the Roman Empire in the west ceased to exist, marking the start of the Middle Ages as we know them.


This chapter in history is a true contradiction. If on one hand we had such an advanced civilization in terms of technology, culture and organization, what went wrong to let the barbarians take over?

The explanation is plain brute force from the barbarians and faulty Roman administration policies.

We see Roman history filled with heroes who shaped the republic and the empire through successes and falls across the ages; yet even the barbarians admired Roman culture and attempted to replicate the same level of civilization, most times to no avail.


What can we learn as a lesson of history?

Civilized cultures are not prepared well enough to survive brute-force threats from the outside, or even their own weak management policies.

We see the same issues plaguing computer systems nowadays. Computer users attempt to secure their own machines from outside threats but are unable (or unwilling) to keep them tidy with effective security policies.

The barbarians are today's malware authors that use conquered computers as slaves to power monstrously sized botnets that serve as weapons to attack companies and organizations across the globe.

This is the reality today. We have barbarians and we have civilized citizens trying to work. But we still have few means to fight back against the threats and propose truly efficient methods to counter them; instead, we rely on defense, and few actions are taken to go after the wrongdoers.

Looking to where we are headed, will we let history repeat itself with the fall of a civilized Internet?

I think we should learn from the lessons of the past: we cannot close our eyes to the enemies at the border and expect that our own organization will be able to solve these menaces some day.

So, I would like to propose a roman-like concept of cooperative defense for the internet world.

Back in the old days we would see farmers picking up anything at hand to defend their lands. This reality is not far from today, where computer users install antivirus software on their machines to keep them protected.

However, does it sound reasonable to you that a farmer is truly capable of defending himself against an aggressive tribe of barbarians who are professional hackers?

Something is truly wrong with this picture. Over the next months I will dedicate my effort to proving that we can indeed be Romans; proving that civilized people can fight back against the attacks of barbarians and level the weights on each side of the fence.

Let's begin.

Finishing up the second semester

It's been 4 months since I arrived in the States and now it's time to go.

Pittsburgh was very nice to me. The people, the university and the town were pleasant and welcomed me into their traditions and culture, making me feel part of this place instead of an outsider.

I will soon finish the second semester and head back to Portugal but things won't be easy there.

From an educational perspective I'll be working to implement our software engineering studio project and will also be doing an independent study on a self-proposed topic.

The tough part for me will be finding a place to stay, a place to study, and a way to manage my situation with the Portuguese army, along with balancing my finances to ensure that I can support all these expenses.

But things are moving forward and a lot of people are helping. I can't predict what will happen next, but at least I'm happy to still be studying and getting closer to completing another step in the MSE challenge.

:)

Link exchange with MSFN

MSFN is one of the oldest and most respected sites when it comes to Microsoft discussions. It was created in 2002 with the purpose of sharing knowledge on how to tweak Windows to work the way fans intended.

Over the years it has specialized in the customization of unattended Windows installations; some of the most popular tools that spawned from this forum are nLite and vLite, which reached global success.

Boot Land started in 2006 when MSFN was already a giant in the Windows arena. Over the years we have also grown to the point where we have become giants in our own domain.

MSFN and Boot Land have mutually exchanged links. To date, we are the only forum recommended on MSFN, and we have come a long way to deserve such attention.

My personal thanks to everyone who supports the boot land community!

Virus Removal Pro

I started a new project last week: a new community forum entitled "Virus Removal" that can be found at http://virusremoval.pro

Boot Land was and remains a success as a community focused on the development of boot disks, but I've been feeling that it is also time to expand to other themes that are important and interesting to explore.

Funny enough, I first started creating boot disks to repair my computer after it was attacked by a virus. So, in a good sense, I can say that both activities have been part of my routine ever since I was a kid.

This new community has a lot of good things.

I've used MyBB as the forum software; the strong point is that it won't incur any license costs since it is free, and I can also add plugins in a really easy manner that doesn't require editing any files. Overall it is a very simple and straightforward piece of software to use.

The domain itself is very easy to remember and straight to the point. Our forums at Boot Land are also being used to help kickstart the forum and bring new visitors.


Last but not least, virusremoval.pro already counts on the participation of valuable experts in this field.

-------

So, it seems we have all the right ingredients to start a good community that can actually contribute something fresh and useful to people around the Internet.

:)

http://freeflippa.com donated to the owner of http://flippa.com


Last year I started a project meant to provide a free alternative to the services provided by Flippa, a website dedicated to "website flipping", the art of selling websites that were just created.

I chose the domain http://freeflippa.com and, shortly after opening the doors of this new site, the owner of flippa.com approached me by email with concerns about trademark protection.

Under European law, nothing would restrict me from using this specific domain since it is not a trademark recognized in Europe or the States.

Nevertheless, I'm not the type of person who enjoys "cybersquatting", so I removed anything on the site that would compete against the paid services provided by flippa.com

Two days ago I decided to clean up my portfolio of domains and remembered that I still had freeflippa. I decided to send an email to the owner of Flippa and transferred freeflippa.com to his ownership at zero cost.

Webmaster projects are very attractive to some extent, but I guess I'm just not cut out for them.

I'd rather stick with software development; it was nevertheless another episode of my life to keep in memory.

:)

New server!

A new server was added to power up our sites.

The new machine is a brand new i7 server with 8 CPU cores and a whopping 24 GB of RAM, running Ubuntu x64.

The configuration was a bit more problematic than expected. The older server had all its domains managed with Plesk, but this platform was not supported by the new hosting and I simply couldn't afford the added cost of a Plesk license each month.

So, I decided to try out the free alternatives to cPanel and Plesk. My attention first turned to Webmin and Virtualmin, since they allow most aspects of a server to be managed quite easily.

However, the management of virtual hosts under Apache soon turned out to be a headache. Some bugs occurred and it became very difficult to get things working as expected.

Disappointed with Apache, I turned my attention to Nginx as a web server. However, the lack of documentation and clear examples also kept me from using it as expected.

Last but not least came lighttpd. Its reputation for performance is well known and I decided to give it a shot. It turns out that this server has reached a very mature and stable level of performance and was only an apt-get away from being installed on the machine.

PHP support came out of the box, just like I needed. Adding a new virtual host is as simple as adding a new line to the configuration file and restarting the server.

As for speed, boot-land.net is now loading blazingly fast. Lighttpd is serving static files at a really great pace on a far faster server than ever before, and I'm happy to see it working well.

I still need to configure some other details. Luckily I didn't have any sites requiring mod_rewrite.



Attached is a screenshot of the new server in action; it feels really good to see all the work being distributed amongst the 8 cores with plenty of RAM to go around.

The monthly cost for this server is really high, especially considering that the older server was fully sponsored by R1Soft and that I'm now supporting the new hosting costs on my own.

But I was unable to get in contact with them, and the Boot Land sites were in dire need of better conditions.

Let the fun begin.

:)

Bring on the bots..

We've been having a lot of fun over the past two weeks on our Boot Land server.

It's been a long while since the last time we went offline for such a long period (2 years?); back then, overcoming the attack required us to move from a virtual shared server to a fully dedicated server.

At the time, our growth just kept going up unexpectedly and suddenly we had thousands of visitors from all around the world looking for boot disk solutions.

This was two years ago.

Currently... Boot Land has grown to thousands of daily visitors and terabytes of information exchanged from the server every month.

We've been happy and busy working on the new things that have appeared since. However, there is still one situation that is challenging our capacity to survive the test of time: bots.

Given Boot Land's current position as a popular site, we are experiencing a massive wave of bot machines that visit our site and associated domains (winbuilder.net et al), causing our resources to be quickly depleted into what people would call a DDoS. Obviously, this is a good way to ensure that a competing site is sent into oblivion.

And a strong DDoS it was indeed. The server was not prepared and a massive blackout set in for 5 days in a row.

But... we're alive. Sometimes slow at serving pages, it's true, but we're winning our server back from the bots that were sent our way.

So, if by any chance the person(s) behind this recent bot incursion wind up reading this blog post, do note one thing in particular about Boot Land: we're here to stay. Bring on the bots.