My hosting provider offers unlimited disk space. It used to be limited but generous (around 600 GB), and even that limit has since been lifted. However, there was one catch: the number of inodes is supposed to stay under 250K. That's an awful lot if you think about it, since each file and directory consumes an inode. I host several domains with my plan, which also supports unlimited domains, and one of them happened to require lots and lots of files. That was because I was pre-generating static HTML pages rather than generating them on the fly, on demand. My hosting provider is a decent one. I once accidentally ended up with a CGI script stuck in an infinite loop, and I simply got a phone call and an email saying the script had been terminated. I fixed the issue, and after I filed a ticket they made the script executable again. As for the inode limit, 250K is not a hard limit. It's OK to exceed it (and I did end up at 255K+), but apparently the backup process will not pick up all the files.
I had been serving the pre-generated static HTML files using a CGI script. So I got rid of the inode problem by bundling all the files in each folder into a single tar.gz (compressed archive) file and then, at runtime, extracting the required file. This reduced the inode count significantly. Of course, it adds runtime overhead since the file has to be extracted on each request, but even though I have so many files, the traffic to the site is very light, so the extra overhead isn't really an issue.
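The extraction step above can be sketched in a few lines of Python with the standard tarfile module. This is a minimal illustration, not my actual CGI script; the function and file names are hypothetical.

```python
import tarfile

def read_member(archive_path, member_name):
    """Read one file's text out of a per-folder tar.gz archive.

    Opens the compressed archive, pulls out just the requested
    member, and returns its contents as a string.
    """
    with tarfile.open(archive_path, "r:gz") as tar:
        f = tar.extractfile(member_name)  # None for non-file members
        if f is None:
            raise FileNotFoundError(member_name)
        return f.read().decode("utf-8")
```

A CGI handler would call something like `read_member("pages.tar.gz", "page1.html")` and print the result. The whole archive is decompressed up to the requested member on each hit, which is exactly the CPU-for-inodes trade-off described above.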
Besides using a compressed archive on top of a regular file system, I would like to know if there are any other efficient alternatives for storing and accessing large chunks of text.
I used to have a PageRank of 5 for this blog. After more than two years and several software-related posts, I had reached PageRank 5 and about 250 page views a day. Then, during the last PageRank update, I went from 5 to 4. That didn't bother me; as you know, it fluctuates, like the stock market. And I know the stock market has fluctuated a lot more recently, mostly going down. But that can't be the reason for my blog to lose PageRank. Can it? I mean, it's not like I dropped to 0! So, what happened to my PageRank bar, Google dude?
Recently I developed a website whose user-generated data was growing steadily. The database used is SQLite3. Before putting it up on the internet, I made sure to verify that all the pages performed reasonably well (sub-second response times). Then, as time passed, one of the pages started slowing down, and it got worse with each passing day. After searching the web for SQLite performance tips, I noticed that the order of the tables matters! Coming from an Oracle CBO (Cost Based Optimizer) background, I wasn't expecting to have to do that, but apparently yes, the order matters. After changing the order, the reports are now back to sub-second response times. What's better: given that my reports are filtered by a time interval, no matter how much the underlying tables grow, the report should take roughly constant time.
So, make sure that in the FROM clause you place the most selective table or sub-query first, followed by the rest of the tables. The rest of the tables are typically only used to resolve a few foreign key references of the main table and are usually not selective. For example, if userid is a foreign key and you need to look up the username, but the report is not selective by user, then it's better to place the users table after the transaction table that gets filtered by the time interval or other parameters.
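The advice above can be shown with Python's built-in sqlite3 module. The schema and table names here (a `tx` transaction table with a `userid` foreign key into `users`) are made up for illustration; the point is that the time-filtered, selective table comes first in the FROM clause and the lookup table comes second.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users(id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE tx(id INTEGER PRIMARY KEY, userid INTEGER, ts TEXT, amount REAL);
CREATE INDEX tx_ts ON tx(ts);
""")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "alice"), (2, "bob")])
conn.executemany("INSERT INTO tx(userid, ts, amount) VALUES (?, ?, ?)",
                 [(1, "2009-01-15", 10.0),
                  (2, "2009-02-20", 20.0)])

# The selective table (tx, filtered by the time interval) is listed
# first; users only resolves the userid foreign key afterwards.
report = """
SELECT t.ts, t.amount, u.name
FROM tx t, users u
WHERE t.ts BETWEEN ? AND ?
  AND u.id = t.userid
"""
rows = conn.execute(report, ("2009-01-01", "2009-01-31")).fetchall()
```

With an index on `tx.ts`, the interval filter narrows the scan before the join, so the cost tracks the size of the interval rather than the full table. (Newer SQLite versions reorder joins on their own, but at the time this was written the order as given mattered.)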
A long time ago, websites were optimized for 1024×768 displays, when screen resolutions weren't as good as they are these days. Then, with larger screens on both desktops and laptops, that was no longer an issue, and many websites moved to much wider layouts. But this holiday season (check the popular laptops sold on Amazon) and Microsoft's recent quarterly results confirmed that netbooks are here to stay, and people will use them for a lot of casual web surfing. And most of these netbooks have only a 1024×768 resolution or less. It may be possible to increase the dots per inch and provide a higher resolution, but the physical size of these sub-notebooks or mini-notebooks or netbooks or whatever you call them is not going to make higher resolutions easy on the eye. So maybe it's time to start optimizing websites for 1024×768 again.
We all know the economy is bad and that we have been in a recession for more than a year, though we were told about it only just now, after the elections (almost as if by conspiracy). And with so much of a meltdown, there is an outcry over CEO pay and over the way they reached Washington for a bailout (flying corporate jets instead of driving hybrid cars).
When discussing this CEO pay issue with friends, I thought the reason it will stay high, with nothing to be done about it, is that the shareholders, a major portion of whom are institutional investors like mutual and hedge fund managers, probably don't care about restricting CEO pay, because it would come back to bite them when they take their own hefty bonuses for fund performance during the good years (and probably the bad years as well, which really upsets the common public, myself included, whose retirement savings plummeted by half). But apparently the real reason is that shareholders in the US don't have a say on executive compensation the way they do in European countries, where the ratio of executive pay to rank-and-file salaries is much lower than in the US. Hence the "say on pay" proposition. We'll have to see where it goes.
Anyway, after following all this, when I recently read an article on how companies are cutting their IT spend by adopting more and more open source, it got me thinking about open source and CIO pay.
Here is the thing. If you are a business owner and you find that you are paying 10 times more for the personnel who maintain the software than for the software licenses themselves, how would you feel? I mean, if a medium-to-large company uses software like Oracle, SAP, or Microsoft's enterprise offerings, the cost of the software licenses and support is much more than the salaries of the people who manage those applications. But if the IT guys start using open source software and shrink one part of that equation, what happens to the other part? Maybe that's when the CEOs will notice the extra overhead and move to SaaS, keeping the IT staff minimal or none at all?
Let me see: if a big company starts using SaaS for all of its IT requirements, including email, collaboration, and enterprise applications (ERP, CRM, HCM, and Financials), does it still need a CIO, and does it still need to pay him a hefty $250K salary or more?
Desperate times call for desperate measures. As some political figure said during the bailout discussions, "the party is over." Good-for-nothing, old-style management should stop justifying its very existence and start embracing new ways of doing business. It's sometimes sickening to see a handful of people doing the real work while more than double that number just introduce process after process, which seldom works, to justify their job roles and compensation.