I thought I’d try something new: batching up things I discover throughout the week and appending them to a post to be published at the end of the week.
I am pretty certain that it will prove to be more diverse than previous posts, and with summarized points, it might actually be shorter than my “regular” posts.
If you like my longer, but more irregularly scheduled posts, fear not, those will continue, with about the same irregularity as usual ;P
Using this, one could first design a site which works with most browsers (I consider MSIE 6.0 a lost cause) and then extend the capabilities of the site for those browsers that can handle it.
timetrack and timesummer
I recently started working on a small project aimed at helping me keep track of the hours I put into various (other) projects, and the result is two scripts, timetrack and timesummer (I am desperately trying to find a better name for the latter, suggestions welcome).
I promise to have it in a public repository soonish.
timetrack stores the current date and time in a “timetrack file” whenever it is called, and at the same time determines whether the current invocation closes an ongoing session or starts a new one.
If it determines that it is closing a session, it also asks me to briefly describe what I have been working on. The script then calculates how long the session was and writes that to the file as well, along with the brief session summary.
timesummer simply reads the same timetrack file, sums up the hours from all the sessions, and prints the total to STDOUT.
It is multi-user capable-ish, since each file is created and stored in the format “.timetrack.$USER”. All in all it serves me pretty well.
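Since I haven’t published the code yet, here is a minimal sketch of the idea as I describe it above; the pipe-delimited file format, the minute-based summing, and all the details are my own stand-ins, not the actual implementation:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of timetrack/timesummer; the file format and
# field layout are assumptions made for illustration.
TRACKFILE="${TRACKFILE:-$HOME/.timetrack.$USER}"

timetrack() {
    local now
    now=$(date '+%Y-%m-%d %H:%M:%S')
    if [ -f "$TRACKFILE" ] && tail -n1 "$TRACKFILE" | grep -q '^start'; then
        # Closing an ongoing session: ask for a summary, store the length.
        local started stopped mins summary
        started=$(date -d "$(tail -n1 "$TRACKFILE" | cut -d'|' -f2)" +%s)
        stopped=$(date -d "$now" +%s)
        mins=$(( (stopped - started) / 60 ))
        read -r -p "What did you work on? " summary
        printf 'stop |%s|%s min|%s\n' "$now" "$mins" "$summary" >> "$TRACKFILE"
    else
        # No open session: start a new one.
        printf 'start|%s\n' "$now" >> "$TRACKFILE"
    fi
}

timesummer() {
    # Sum the stored session lengths and print the total.
    awk -F'|' '/^stop/ { split($3, a, " "); total += a[1] }
               END { printf "%d minutes total\n", total }' "$TRACKFILE"
}
```

Calling timetrack once opens a session; calling it again closes the session and prompts for the summary, and timesummer totals everything up.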
Another project of mine is switch-hosts.sh, a script created to live in /etc/wicd/scripts/postconnect/ and copy /etc/hosts-home or /etc/hosts-not-home into /etc/hosts depending on my location (inside or outside of my home network).
Why I do this is a long-ish kind of story, but if you have ever cloned a mercurial repository from inside a private network and then tried to access it from outside the network, you should be able to figure it out.
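The core of the script is a two-way pick between hosts files. A rough sketch, where HOME_NET and the subnet-prefix check are illustrative placeholders rather than the script’s actual detection logic:

```shell
#!/usr/bin/env bash
# Sketch of the switch-hosts.sh idea; HOME_NET and the prefix check
# are hypothetical placeholders, not the real script's logic.
HOME_NET="192.168.1."   # assumed home subnet prefix

choose_hosts() {
    # Map the current IP address to the hosts file that should be active.
    local current_ip=$1
    case "$current_ip" in
        "$HOME_NET"*) echo /etc/hosts-home ;;
        *)            echo /etc/hosts-not-home ;;
    esac
}

# Dropped into /etc/wicd/scripts/postconnect/, it would then do roughly:
# cp "$(choose_hosts "<current address>")" /etc/hosts
```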
The script stopped working. That’s twice now this has happened, but sufficiently far apart that I couldn’t remember why it happened without investigating it.
It all boiled down to me using GET (found in the perl-libwww package) to fetch my external IP address so that I could determine whether I am inside my network or outside it.
GET (and POST and HEAD) doesn’t live in /usr/bin or /usr/local/bin or some place nice like that. No, GET lives in /usr/bin/vendor_perl (or at least it does now; before a system upgrade it lived somewhere else…).
I don’t know why someone (package maintainer, the Perl community, whoever…) felt it was necessary to move it (twice now), but I guess they had their reasons. Since I used absolute paths in switch-hosts.sh so that I wouldn’t need to worry about what environment variables had been set when the script was executed, renaming the directory GET lived in meant breakage…
This isn’t me passive aggressively blaming anyone, but it did kind of irk me that the same thing has happened twice now.
Plz to be can haz makingz up of mindz nao, plz, kthxbai.
I love GET and HEAD, and will continue using them, manually. For the script, the obvious solution was to switch to something which by default lives in /usr/bin and doesn’t move, something like… curl.
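A sketch of what the curl-based replacement could look like; the icanhazip.com endpoint and the comparison against a known home address are my assumptions, not necessarily what switch-hosts.sh actually does:

```shell
#!/usr/bin/env bash
# Fetch the external IP with curl instead of GET; the endpoint here
# is an example service, not the author's actual choice.
external_ip() {
    # -s silences the progress meter, -f fails on HTTP errors.
    curl -sf https://icanhazip.com
}

is_home_network() {
    # Compare the fetched address against the network's known external
    # address (both passed in, to keep the check testable offline).
    local current=$1 home=$2
    [ "$current" = "$home" ]
}

# Usage idea: is_home_network "$(external_ip)" "$KNOWN_HOME_IP"
```

Unlike GET, curl reliably lives in /usr/bin, which is the whole point of the swap.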
I have found myself working with PHP again. To my great surprise it is also rather pleasant. I have, however, found myself in need of a templating system, and since I am not in control of the server the project is going to be deployed on, I cannot install anything outside the document root.
From what I gather, that disqualifies Smarty, which was my first thought. Then I found Savant, and although I am sure that Savant doesn’t sport nearly all the bells and whistles that Smarty does, for the time being, it seems to be just enough for me.
I am going to enjoy taking it for a spin and see how it will fare.
I do not enjoy bashing well-meaning projects, especially not projects I know I could benefit from myself, but after reading the material on the unhosted site, I remain sceptically unconvinced.
The idea is great, have your data encrypted and stored in a trusted silo controlled by you or someone you trust enough to host it, henceforth called “the storage host”.
Since everything is executed on the client side, the user can verify that the code isn’t doing anything naughty with their data, like storing it unencrypted somewhere else to sell to advertisers or the like.
For me personally, this concept rocks! I could use this and feel confident in it. But requiring the end user to be the first, last and only line of defense against malicious application providers… (well, of course, the situation right now is at least as bad) isn’t going to fly.
One could experiment with code-signing, and perhaps a browser add-on, and make a “fool-proof” user interface, hiding away the underlying public key cryptography that would be needed, but somewhere along the line the user would still need to know someone who could read the code, could sign it, and then act as a trusted verifier.
Finally, a random assortment of links I found in various places during the week:
The Bun Protocol
Hybrid Core, A WordPress Theme Framework
201 ways to arouse your creativity
Revelation of the week: Thanks to the “Laptop Bubbles” post I realized that I now consider bash (shell scripting) my primary language, thus displacing Python to a second place.