Posts Tagged ‘cli’

Introducing passtore

Sunday, May 8th, 2011

First of all, I find it prudent to insert a HUGE disclaimer:

I have no formal education within the field of IT security, and there may, unbeknownst to me, be millions of ways to circumvent the security this suite offers.

Naturally I have tried to make it as safe as I can since I am using it myself, that said, I offer no guarantees that a determined aggressor couldn’t make short work of the protection offered.

If you know that there are threats aimed at you, you should probably also know that this software is not for you.

This is meant to be used by ordinary people like myself, who’d just like to improve the security of their various accounts and services by using unique, and probably longer and stronger, passwords for each and every service they subscribe or otherwise have access to.

passtore has worked well for me over the last 6+ months I have been using it, but mind you, to the best of my knowledge there are no determined efforts by an aggressor to compromise my security.

Behind the scenes passtore uses GPG to store passwords in the file ~/.gnupg/passwords.gpg, and optionally depends on xclip (for copying a password to the clipboard) and pwgen (for generating strong, entropy-rich random passwords, or at least as random as a deterministic system can make them).
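For illustration, the underlying mechanics can be sketched in a few lines of shell. This is not passtore's actual code: the demo uses symmetric encryption (-c) with a throwaway passphrase so it needs no keyring, whereas passtore encrypts to your GPG public key.

```shell
store=passwords.gpg

# Add an entry: generate a password (fall back to a fixed one if pwgen
# is missing) and encrypt a "host: password" line into the store.
pass=$(pwgen -s 20 1 2>/dev/null || echo 'demo-password')
printf 'example.com: %s\n' "$pass" |
  gpg --batch --yes --pinentry-mode loopback --passphrase demo -c -o "$store"

# Retrieve it: decrypt the store and pick out the host's line.
gpg --batch --quiet --pinentry-mode loopback --passphrase demo -d "$store" |
  grep '^example.com:'

# What `getpass -c <host>` does, conceptually: pipe the password to xclip.
#   ... | awk -F': ' '{print $2}' | xclip -selection clipboard
```

The password never touches the disk unencrypted; it only exists in plaintext on a pipe (or, with -c, in the clipboard).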

As it is a CLI-based suite, it is also rather easily scriptable (not to the point of allowing full automation, since the user will need to input the GPG private key passphrase), and it has been successfully plugged into other applications such as mutt, msmtp and offlineimap.

There are a couple of gotchas that one needs to be aware of for a moderately safe operation of these scripts:

  • The protection offered is not stronger than the strength of the passphrase securing your GPG private key
  • If the aggressor gets hold of ~/.gnupg/passwords.gpg and your GPG private key s/he could potentially brute-force it open offline in their own good time
  • If the aggressor can modify the scripts ({add,get,mod,del}pass) or the ~/.passtorerc s/he can compromise your security
  • If the aggressor could modify your ~/.gnupg/passwords.gpg file, s/he can lock you out of all the places with passwords protected by passtore
  • If the aggressor could modify your ~/.passtorerc file, s/he could add another (unauthorized) recipient to the ~/.gnupg/passwords.gpg file
  • If the optional dependency xclip is used (getpass -c <host>) the password will be stored in the X clipboard until overwritten by something else
  • While the password sits unencrypted in the clipboard, there is a minute risk that swapping occurs, pushing it onto the swap space; passtore does not perform any sort of hard disk or RAM scrubbing
  • If you forget the passphrase for your GPG private key, you won’t be able to unlock the ~/.gnupg/passwords.gpg file… ever
  • If either your GPG private key, or the ~/.gnupg/passwords.gpg file is corrupted, you are truly out of luck
  • Some services will seem to accept a long password full of special characters, right up until you have actually changed it and try to log in, at which point you are locked out; the moral of the story? MAKE SURE THAT THE EMAIL ADDRESS YOU PROVIDED IS A REAL ONE SO YOU CAN RESET THE PASSWORD!

Most of these issues can be handled with common sense and sane file permissions (0700 for the scripts, 0600 for the files), and by not allowing untrusted people onto your account.

Nevertheless, security is a hard topic to get right, so please do not use this software if your life could depend upon the correct and secure operation of it.

My previous way of handling passwords was to think up a “base password” which I then modified slightly for each and every service.

Think along these lines: if “pizza” was my base password, “hotpizza” would be my hotmail password, while “goopizza” would be my google password. (In reality I used a longer base password than that.)

The primary problem with this was that if someone ever were to learn of the base password, they’d have the keys to my kingdom.

Since I am not in the business of divulging that sort of thing to anyone, you might incorrectly think that this is a safe way of doing it. You’d be wrong.

What would have happened if I had been lured into signing up for an account with a new service which seemed legit, but which in reality was nothing more than a honeypot for usernames, email addresses and passwords?

Do you use different usernames on different services? Most of us don’t, and there may even be some value in not doing it (recognition/reputation of sorts from other services).

So even with my previous password system (it would of course have been a total bust if I had used the same password everywhere), an aggressor could have reverse-engineered the base password and reconstructed my passwords for other services.

Of course, given the amount of people who just use the same password everywhere, I don’t think they’d have bothered with my password at all, unless they were specifically targeting me, which is wholly unlikely as well.

But with passtore, I don’t even need to care or worry. If the site admin is a sleazebag, or incompetent/unlucky enough to have the database stolen by aggressors, or a “friend” tries to compromise an account, that’s as far as they’ll get.

Obtaining one password for one service gives them control over that service, nothing more (with one obvious exception: if someone were to gain access to my email account password, they could reset the password on every service registered with that email address).

Be paranoid about your email passwords people! It is unfathomable to me how easily people hand over their usernames and passwords to their email accounts to sites like LinkedIn and Facebook.

Sure, they are “only” scanning your contacts for friends already present, and any service that went beyond that would very quickly be found out, get a bad rep, and in all probability have criminal charges brought against it.

With that said, who knows whether Facebook or LinkedIn, or any of the other social media sites out there that want you to divulge your email password to them in the name of contact building, stores your password, and if so for how long, and for what purpose.

passtore will let me use different passwords for different services, without making it hard on my memory. In doing so, it mitigates the effects it will have on my life if a single service is compromised.

passtore will keep my passwords safe from nosy siblings, friends and partners, and, depending on the strength of my GPG privkey passphrase, it would keep them safe from most determined aggressors as well.

Could Google bruteforce their way in? Probably.
A government funded agency? Definitely.

As I am not facing that type of opposition, and the only threat to me is to inadvertently entrust a service with a password, which the service providers may try to abuse, passtore works well for me.

The usual disclaimers apply: I assume no responsibility for any damages you might incur. If you lock up a whole host of passwords and have either your passwords.gpg file or your GPG private key corrupted, that is truly unfortunate, but I designed passtore to be as secure as I could make it. It is not meant to be recoverable or decryptable without these files, so please make sure that you have backups of them somewhere safe.

Again, be smart, be safe, and use it at your own risk.


Sunday, May 8th, 2011

META section
I thought I’d try something new, like batching up things I discover along the week, and append it to a post to be published at the end of the week.

I am pretty certain that it will prove to be more diverse than previous posts, and with summarized points, it might actually be shorter than my “regular” posts.

If you like my longer, but more irregularly scheduled posts, fear not, those will continue, with about the same irregularity as usual ;P

Content section


Modernizr is a JavaScript library designed to detect which HTML5 capabilities a visiting browser has. This enables a kind of “progressive enhancement” which I find very appealing.

Using it, one could first design a site which works in most browsers (I consider MSIE 6.0 a lost cause) and then extend the capabilities of the site for those browsers that can handle it.


timetrack and timesummer

I recently started working on a small project aimed at helping me keep track of the hours I put into various (other) projects. The result is two scripts, timetrack and timesummer (I am desperately trying to find a better name for the latter; suggestions welcome). I promised to have it in a public repository soonish, and timetrack can now be found at bitbucket.

timetrack stores the current date and time in a “timetrack file” whenever it is called, and at the same time determines whether the current invocation closes an ongoing session or starts a new one.

If it is determined that the script is closing a session, it will also ask me to briefly describe what I have been working on. The script then calculates how long the session was and writes this to the file as well, along with the brief session summary.

timesummer simply reads the same timetrack file, and sums up the hours from all the sessions, and prints it to STDOUT.

It is multi-user capable-ish, since each file is created and stored in the format “.timetrack.$USER”. All in all it serves me pretty well.
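A minimal sketch of the idea (the real scripts on bitbucket differ, and the file format here is made up for the example):

```shell
#!/bin/sh
# Hypothetical format: "START <epoch>" opens a session,
# "STOP <epoch> <minutes>min <summary>" closes it.
TT_FILE="${TT_FILE:-$HOME/.timetrack.$USER}"

timetrack() {
  last=$(tail -n 1 "$TT_FILE" 2>/dev/null)
  case "$last" in
    START\ *)                 # an open session exists: close it
      start=${last#START }
      now=$(date +%s)
      printf 'What were you working on? ' >&2
      read -r summary
      printf 'STOP %s %smin %s\n' "$now" "$(( (now - start) / 60 ))" "$summary" >> "$TT_FILE"
      ;;
    *)                        # no open session: start one
      printf 'START %s\n' "$(date +%s)" >> "$TT_FILE"
      ;;
  esac
}

timesummer() {                # sum the minutes of all closed sessions
  awk '/^STOP/ { sub(/min/, "", $3); total += $3 }
       END { printf "%.1f hours\n", total / 60 }' "$TT_FILE"
}
```

The nice part of the toggle design is that there is only one command to remember: calling timetrack either opens or closes a session depending on the state of the file.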

Another project of mine is a script created to live in /etc/wicd/scripts/postconnect/ and copy /etc/hosts-home or /etc/hosts-not-home into /etc/hosts depending on my location (inside or outside of my home network).

Why I do this is a long-ish kind of story, but if you have ever cloned a mercurial repository from inside a private network and then tried to access it from outside the network, you should be able to figure it out.

The script stopped working. That’s twice now this has happened, but sufficiently far apart that I couldn’t remember why it happened without investigating it.

It all boiled down to me using GET (found in the perl-libwww package) to fetch my external IP address so that I could determine whether I am inside my network or outside it.

GET (and POST and HEAD) doesn’t live in /usr/bin or /usr/local/bin or some place nice like that. No, GET lives in /usr/bin/vendor_perl (or at least it does now; before a system upgrade it lived somewhere else…).

I don’t know why someone (package maintainer, the perl community, whoever…) felt it was necessary to move it (twice now), but I guess they had their reasons. Since I used absolute paths in the script so that I wouldn’t need to worry about what environment variables had been set when it was executed, renaming the directory GET lived in meant breakage…

This isn’t me passive aggressively blaming anyone, but it did kind of irk me that the same thing has happened twice now.

Plz to be can haz makingz up of mindz nao, plz, kthxbai.

I love GET and HEAD, and will continue using them, manually. For the script, the obvious solution was to switch to something which by default lives in /usr/bin and doesn’t move, something like… curl.
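A sketch of what such a postconnect hook might look like (the IP address and the IP-echo service are placeholders, not my actual setup):

```shell
#!/bin/sh
HOME_IP="198.51.100.7"      # placeholder: the network's external address

pick_hosts() {              # decide which hosts file belongs in /etc/hosts
  if [ "$1" = "$HOME_IP" ]; then
    echo /etc/hosts-home
  else
    echo /etc/hosts-not-home
  fi
}

# The real hook would fetch the current external address and install
# the matching file, e.g.:
#   cp "$(pick_hosts "$(curl -sf https://icanhazip.com)")" /etc/hosts
```

Since curl ships in /usr/bin on basically every distribution, the absolute path stays valid across upgrades.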



I have found myself working with PHP again. To my great surprise it is also rather pleasant. I have however found myself in need of a templating system, and I am not in control of the server the project is going to be deployed on, and so cannot move outside the document root.

From what I gather, that disqualifies Smarty, which was my first thought. Then I found Savant, and although I am sure that Savant doesn’t sport nearly all the bells and whistles that Smarty does, for the time being, it seems to be just enough for me.

I am going to enjoy taking it for a spin and see how it will fare.



I do not enjoy bashing well-meaning projects, especially not projects I know I could benefit from myself, but after reading the material on the unhosted site, I remain sceptically unconvinced.

The idea is great, have your data encrypted and stored in a trusted silo controlled by you or someone you trust enough to host it, henceforth called “the storage host”.

Then an “application host” provides javascripts which in turn request access to your data. Either you grant it, the application code does something for you, you see that it is good, and all is well; or you don’t grant access and go on your merry way.

The idea is that since everything is executed on the client side, the user can verify that the code isn’t doing anything naughty with their data, like storing it unencrypted somewhere else to sell to advertisers or the like.

For me, this premise is sound, because I am a developer, a code monkey. I can (with time) decipher what most javascripts do.

Problem: the majority of people aren’t developers (well, that is not a problem in itself; they shouldn’t have to be). What I’m saying is that only a subset of all people knows that there exists a language called javascript, and only a subset of that subset can actually read javascript (i.e. in perspective VERY FEW).

For me personally, this concept rocks! I could use this and feel confident in it. But requiring the end user to be the first, last and only line of defense against malicious application providers… (well, of course, the situation right now is at least as bad) isn’t going to fly.

One could experiment with code-signing, and perhaps a browser add-on, and make a “fool-proof” user interface, hiding away the underlying public key cryptography that would be needed, but somewhere along the line the user would still need to know someone who could read the code, could sign it, and then act as a trusted verifier.

My thoughts on what would be easier to teach the end user; public key cryptography or javascript? Neither… :(



Finally, a random assortment of links I found in various places during the week:

The Bun Protocol
Laptop Bubbles
Hybrid Core, A WordPress Theme Framework
201 ways to arouse your creativity


Revelation of the week: Thanks to the “Laptop Bubbles” post I realized that I now consider bash (shell scripting) my primary language, thus displacing Python to a second place.

My software stack revisited – Summary and Future

Wednesday, December 29th, 2010

This is the last post in the series. It has taken me a while to write, edit, and then rewrite because I didn’t like the first outline (which I didn’t realize until I got to the end).

The software I have listed in these posts is there for a reason. While each piece works well in its own right, when put together, the whole forms something greater than the individual pieces.


Most of the software on the list is lightweight, almost a requirement as one of my systems is a netbook, and although the resources on my desktop seem almost infinite in comparison, I want to be able to work with my things on the netbook as well.


I go for CLI-apps rather than GUIs because, more often than not, they keep my fingers planted on the home row instead of sending the right hand scurrying off to the mouse every so often. Because, more often than not, CLI-apps are more lightweight than the lightest of GUIs. And because, more often than not, a CLI takes up less screen space to convey the same amount of information as a GUI does.

CLI-based programs will also work when the X server has broken down and you’re stuck in a TTY, or when you have your software on a server and are accessing it over SSH from a computer where either you or the system is unable to install the software.


I prefer plaintext over binary storage formats because it leaves me in control. If the worst comes to the worst, I can retrieve the data I need from a simple text-editor.

Not to mention that plaintext is easier to diff, which is good when you check it into version control and run into a conflict.

In the case of LaTeX, one of the best practices I picked up was to let each chapter be its own file, and then have a master-file which includes the chapters.

What this means is that when you want feedback on a specific chapter, you only need to send that, and not the entire project, which is bandwidth friendly.
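In practice the master file just pulls in the chapters, along the lines of the following (the file names are made up):

```latex
% master.tex
\documentclass{book}
\begin{document}
\include{chapters/introduction}  % each chapter is its own .tex file,
\include{chapters/method}        % so a single chapter can be mailed
\include{chapters/results}       % off for feedback on its own
\end{document}
```

As a bonus, \includeonly{chapters/method} in the preamble lets you rebuild just the chapter you are working on while keeping page numbers and cross-references intact.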

And please, no comments about fancy gigabit infrastructure all over. You try sending an email or surfing on a long train ride when there are 10+ douche bags hogging all the bandwidth listening to Spotify…

Plaintext is also rather easy to parse. Combine this with a shell script, and your imagination is your only limit to the stuff you could do.
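For instance, two throwaway one-liners over a made-up plaintext journal:

```shell
# A toy journal (hypothetical format), then two quick queries over it.
printf 'TODO buy coffee\nDONE fix wicd script\nTODO write post\n' > notes.txt

grep -c '^TODO' notes.txt                            # prints 2
awk '/^DONE/ { n++ } END { print n, "finished" }' notes.txt   # prints "1 finished"
```

No export step, no special tooling; the data is the interface.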

The future

There are definitely more uses for the server, such as VPN and perhaps a mumble server. Time will tell. I do know that I will spend the next couple of days trying to configure nginx and uwsgi so I could start doing some Python web-stuff.

I have been considering dropping Pidgin and instead give bitlbee a try. This would mean that all communication (with the exception of email and microblogging) would instead go through irssi. That would mean one less GUI-app, and also that all the logs would be gathered in one place (on the server) instead of the current situation where Pidgin logs are interspersed between the desktop and the netbook.

I have in a similar fashion thought about replacing Thunderbird with mutt. It would mean that I would replace one software (Thunderbird) with five (mutt, msmtp, offlineimap, abook and mairix).

Edit: Something very similar to this. (Thank you Thomas)

Well, four anyway, abook is already installed and operational.

msmtp is used to send emails (and thus communicates with an SMTP server).

offlineimap is used to sync an IMAP mailbox (fetch new mail, mark mail as read or deleted on the server, etc.).

mairix is a full text search engine for Maildirs.
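For the curious, a minimal ~/.msmtprc could look something like this (the account details are made up, and the passwordeval line is merely where a passtore getpass call could slot in):

```
# ~/.msmtprc — hypothetical example
defaults
auth           on
tls            on

account        example
host           smtp.example.com
port           587
from           me@example.com
user           me@example.com
passwordeval   "getpass smtp.example.com"

account default : example
```

passwordeval runs a command and uses its output as the password, which is exactly the kind of hook that makes a CLI password store pleasant to script against.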

One of these days, I will also try to become better friends with zsh. It has many nice features which I think could increase my efficiency at least somewhat.

That’s all folks, I hope you have enjoyed reading the series as much as I enjoyed writing it. Comments are enabled in all the posts; feel free to add questions or suggestions.