GNOME 2.22 has been officially released with significant new features like GVFS and PolicyKit. GNOME 2.22 will be included in Ubuntu 8.04 and Fedora 9, which are scheduled for release next month.
Year: 2008
Poll-n-Ping, coz u r busy blogging
I would like to introduce a brand new service: an automated blog search directory pinging service named Poll-n-Ping. It is different from Ping-o-Matic and similar services because Poll-n-Ping monitors the blog (actually the feed) for changes, and when it detects a change it automatically pings the blog search directories.
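The underlying idea is straightforward. Here is a minimal sketch in Python (the function names and the hashing scheme are illustrative, not Poll-n-Ping's actual code; `weblogUpdates.ping` is the standard XML-RPC method that blog search directories accept):

```python
import hashlib
import xmlrpc.client

def feed_fingerprint(feed_text):
    """Hash the raw feed so changes can be detected without storing it."""
    return hashlib.sha256(feed_text.encode("utf-8")).hexdigest()

def feed_changed(previous_fingerprint, feed_text):
    """True when the feed differs from the last snapshot we saw."""
    return feed_fingerprint(feed_text) != previous_fingerprint

def ping_directory(endpoint, blog_name, blog_url):
    """Send a standard weblogUpdates.ping to one search directory."""
    server = xmlrpc.client.ServerProxy(endpoint)
    return server.weblogUpdates.ping(blog_name, blog_url)
```

A poller would fetch each monitored feed on a schedule, compare its fingerprint to the stored one, and call `ping_directory` for every configured directory when the fingerprint changes.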
You can check out the service at http://www.mohanjith.net/pnp. All this comes free of charge, but donations are always welcome. Right now there is no limit on the number of blogs a single user can monitor. If you want your blog submitted to all the blog search directories we add support for from time to time, you will have to visit Poll-n-Ping regularly.
Soon I plan to add an alert service to Poll-n-Ping: subscribed users will receive notification e-mails or IMs when content changes, the blog goes offline, and/or the blog comes back online. However, this will be a paid service unless I receive enough donations to cover the hosting.
Poll-n-Ping has TurboGears under the hood :-).
Hope you will find the Poll-n-Ping service useful.
Hacking drupal: Add search by node creation date and the author
Some users of one of my Drupal sites were asking for search by author and creation date. The site had 3000+ nodes, and the request seemed reasonable enough. I started by googling for a Drupal module or a patch that would add the functionality, but found none.
So I went ahead and hacked the node module. I successfully managed to add search by author and node creation date to the advanced search block. If you are searching for a patch like I did, you can find it at http://drupal.org/node/233476. I'm keeping my fingers crossed to see whether it makes it into the Drupal trunk :).
Blogger 502 errors
A few minutes ago this very blog, hosted on Blogger, started giving a 502 Server Error (for more than 15 minutes). I was frustrated and even thought of hosting my blog on one of my own servers. I don't know what caused the issue, but one thing I do know: this is not the first time, and I was not alone; even http://xooglers.blogspot.com/ was down (giving 502 errors). See http://www.flickr.com/photos/seeminglee/2050618571/in/set-72157603261415176/ for another instance of this issue.
Googling for a cause proved fruitless; my best guess is that the Blogger servers were overloaded. I hope this doesn't happen again.
Filter module support for Premium module
There was a need to show a log-in link, with the destination GET variable set, when an unauthenticated user tries to view a premium node. The obvious place to put such content is the “Premium body text” field in /admin/settings/premium; the problem was that “Premium body text” could only be static HTML, with no filters/input formats.
I couldn't quite see why the filter module was not being used there, so I went ahead and made the necessary changes to make it possible to select the filter/input format applied to the “Premium body text”. The patch adds an Input format section to the settings form; the filter chosen there is applied when the “Premium body text” is rendered into a node.
You can follow the progress of the patch submitted to drupal.org at http://drupal.org/node/231641. I just hope the patch makes it into the premium module HEAD. Development of the premium module is nearly stagnant :(, which concerns me.
Drupal Atom module spits invalid xml
The Drupal Atom module spits out invalid XML in some cases. Obviously, all user-generated text in XML should either be escaped or appear within a CDATA section. However, that is not the case with the title and subtitle elements. If the site title contains “&”, the Atom feed is guaranteed to be invalid.
I came across this the hard way: on one of the sites I was maintaining, someone decided they needed “&” in the title, and the Atom feed started giving an XML parser error. After a little head scratching, I was able to track down the buggy piece of code.
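To illustrate the bug, here is a small Python sketch (standard library only, not the module's PHP) showing how a raw “&” breaks the XML and how escaping fixes it:

```python
from xml.sax.saxutils import escape
import xml.etree.ElementTree as ET

title = "News & Views"  # user-supplied site title containing "&"

broken = "<title>%s</title>" % title          # raw "&" -> not well-formed
fixed = "<title>%s</title>" % escape(title)   # "&" becomes "&amp;"

try:
    ET.fromstring(broken)
    parsed_broken = True
except ET.ParseError:
    parsed_broken = False  # the parser rejects the raw ampersand

parsed_fixed = ET.fromstring(fixed).text  # round-trips back to "News & Views"
```

Wrapping the title in a CDATA section would work just as well; the point is that user text must never be dropped into the feed verbatim.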
You can follow the progress of the issue at http://drupal.org/node/229392, and download the patch from the same page.
Hope I saved someone from much head scratching and frustration.
Hacking TurboGears: Automatically logging in users
I love the way Drupal handles account activation and password reset. The user just has to click a link they receive via e-mail, and they are automatically logged in.
I wanted to do something similar in one of the applications I'm currently developing with TurboGears. I thought I would write a new identity provider, but instead went about hacking TurboGears itself. I noticed that TurboGears' default soaprovider could be improved by separating user authentication from marking a user as authenticated, making the latter reusable.
In my application's controller I use this newly introduced method to mark the user as authenticated. I thought someone else might hit the same problem, so I blogged about it.
You can download the patch from http://www.mohanjith.net/downloads/scripts/python/TurboGears/1.0.4.3/soaprovider.diff; it was created against TurboGears 1.0.4.3.
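The idea behind the refactoring can be sketched like this (a toy illustration, not the TurboGears API; all class and method names here are hypothetical):

```python
class IdentityProvider:
    """Toy sketch of the split: checking credentials and marking a
    session authenticated become two independently usable steps."""

    def __init__(self):
        self.users = {"alice": "secret"}  # stand-in for the user store
        self.authenticated = set()        # stand-in for session state

    def validate_password(self, user_name, password):
        # Step 1: a pure credential check with no session side effects.
        return self.users.get(user_name) == password

    def mark_authenticated(self, user_name):
        # Step 2: reusable on its own, e.g. after an e-mailed activation
        # link is verified, to log the user in without a password.
        self.authenticated.add(user_name)

    def validate_identity(self, user_name, password):
        # The normal login path simply composes the two steps.
        if self.validate_password(user_name, password):
            self.mark_authenticated(user_name)
            return True
        return False
```

With the steps separated, a controller handling an e-mail activation link can call the equivalent of `mark_authenticated` directly, which is exactly the Drupal-style behaviour described above.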
CAS JDBC Service registry trouble
I had an interesting time trying to figure out why the JA-SIG CAS service registry status was being reset whenever I restarted CAS. After much frustration I figured out that the problem was in the schema that Hibernate had automatically created.
To explain my setup: I was using MySQL to store the service registry data via the Spring entity manager and Hibernate. Hibernate was creating BIT(1) columns for boolean attributes instead of TINYINT(1). Because of this, MySQL was not returning anything meaningful for the status. I have now fixed the schema and removed/commented the hibernate.hbm2ddl.auto property in the entity manager bean. It seems to work perfectly.
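For anyone hitting the same thing, the schema fix looks roughly like this (the table and column names below are illustrative, not the actual CAS schema; adjust them to whatever Hibernate generated for you):

```sql
-- Illustrative names; match them to your generated schema.
ALTER TABLE registeredservice
  MODIFY COLUMN enabled TINYINT(1) NOT NULL DEFAULT 1;
```

With hibernate.hbm2ddl.auto removed, Hibernate will no longer overwrite the corrected column type on the next restart.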
Hope someone in a similar situation finds this information useful.
CAS Server 3.2 Final released
Today, the CAS development team announced the CAS Server 3.2 release. The release includes a number of enhancements, bug fixes, and new features. This includes updated dependencies (Spring 2.5.1, Log4j, Acegi Security, Spring LDAP, Spring Web Flow) as well as bug fixes in the SPNEGO module and Services Management tool. It also includes enhancements to enable/disable single sign out at the server level.
Finally, it includes a new Hard Timeout Expiration Policy, an updated Spring Configuration mechanism (and modularized Spring configuration files), as well as a production-ready auditing/statistics tool/API (Inspektr).
You can download the release from the usual location: http://www.jasig.org/products
This is a major release and you should take a look at the major new features (the updated Spring Configuration mechanism and the Inspektr auditing tool) and see how/if it changes your deployment.
Great work, Scott Battaglia and the others who contributed.
Duplicity chokes on OSError: [Errno 24] Too many open files
It was a little too scary. The Duplicity backup scripts were failing on the EC2 instances again; this time around it was not about being unable to reach S3, but about having too many files open. That was weird, because they had never given such an error in the past. The workaround was to increase the maximum number of file descriptors allowed for the user running the backup script.
Finding this solution was tough, however; it was actually a FreeBSD forum that had the answer. I thought I would write it down for Linux.
Step 1: Find out the current limit
To find out the current file descriptor limit for a given user, log in as that user and run the following command.
$ ulimit -n
By default on Debian it would be 1024.
Step 2: Increase the limit
You will have to edit /etc/security/limits.conf. You will find details on how to set up the different limits in limits.conf itself. The record you have to add should look like the following.
username hard nofile 2048
Step 3: Log out and Log back in
You will have to log out and log back in as the user whose file descriptor limit we updated. Then run the following command.
$ ulimit -n
You should see the updated file descriptor limit.
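Since Duplicity itself is Python, you can also inspect the limit (and even raise the soft limit, up to the hard limit, without touching limits.conf) from Python's standard resource module. A minimal sketch:

```python
import resource

# Soft/hard limits on open file descriptors for this process;
# the soft limit is the number `ulimit -n` reports.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft=%d hard=%d" % (soft, hard))

# A process may raise its own soft limit up to the hard limit,
# e.g. from a backup wrapper script, without editing limits.conf:
target = 2048 if hard == resource.RLIM_INFINITY else min(2048, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))
```

This only changes the limit for the current process and its children; the limits.conf route above is still what you want for a persistent, user-wide change.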
Hope this helps someone like me, desperate to get their backups back on track. I will be investigating why there are so many files open; if I find anything interesting I will definitely blog about it. Also, for everyone's reference, there is a bug filed at the Savannah bug tracker by someone else who ran into the same issue.