Simple Chrooted SSH

Published on Thursday, March 29, 2007

You might be asking: why would you want to chroot ssh? Why use ssh anyways? Here are the quick answers:

  • FTP usually sucks. Unless sent over SSL, all information is sent in cleartext.

  • SSH usually rules. SSH sends all data over an encrypted channel -- the main drawback is that you can often browse around the system and, if permissions aren't set right, read things you shouldn't be able to.

  • Chroot'd SSH rocks. It solves both of the above problems.


So, let me tell a quick story.
When I started uni in 2001 I was a nerd. Still a nerd, I guess. I was cramped in my apartment on campus with like 5 boxes, most of them crappy p100s running Linux or OpenBSD. Life was good.
I started a CS degree (I later shifted into Business with a focus on IT), and we were told to use the school's main servers to compile our programs. The other interesting thing is that all user accounts were visible when logged in via ssh -- but hey, that is just the nature of Linux. I knew this, but asked the head I.T. person, "why don't you jail the connections?" He responded quickly, telling me to go away.
Well, shortly after I made the comment (although solutions existed at the time), pam-chroot was released. This is right about the time students figured out they could spam everybody in the school, some 25,000 emails, quickly and easily -- 'cause all the accounts were displayed. Sweet -- now we can chroot individual ssh connections.
This quick demo will be on Debian, and we'll create a pretend user named "karl" (I'll assume you've already added the user before beginning these steps). Also, the jails will be in /var/chroot/{username}.

First: Install libpam-chroot and makejail

kelvin@server ~$ sudo apt-get install libpam-chroot makejail


Second: makejail config file



Put the following in /etc/makejail/create-user.py:
#Clean the jail

cleanJailFirst=1


preserve=["/html", "/home"]


chroot="/var/chroot/karl"


users=["root","karl"]

groups=["root","karl"]


packages=["coreutils"]



Edit: If you need to use SFTP also, try this config:


cleanJailFirst=1

preserve=["/html", "/home"]

chroot="/home/vhosts/karl"

forceCopy=["/usr/bin/scp", "/usr/lib/sftp-server",
 "/usr/bin/find", "/dev/null", "/dev/zero"]

users=["root","karl"]

groups=["root","karl"]

packages=["coreutils"]


As you'll see, there is a "preserve" directive. This is so that when you "clean" the jail (if you need to refresh the files, for instance), you won't wipe out anything important. I created an /html directory so that the user can upload their web docs there.

Third: configure libpam_chroot


Add the following to /etc/pam.d/ssh:
# Set up chrootd ssh

session required pam_chroot.so


Fourth: allow the actual user to be chroot'd


Edit /etc/security/chroot.conf and add the following:
karl /var/chroot/karl


Fifth: create/chown the chroot'd dir


kelvin@server ~$ sudo mkdir -p /var/chroot/karl/home

kelvin@server ~$ sudo chown karl:karl /var/chroot/karl/home


Now you should be able to log in via the new username karl, and find yourself inside the jail.

Layer Images Using ImageMagick

Published on Thursday, March 22, 2007

For one of my webapp projects I'm needing to layer two images. This isn't a problem on my laptop -- I just fire up GIMP, do some copy 'n pasting, and I'm done. However, since everything needs to be automated (scripted), and on a server -- well, you get the point.
The great ImageMagick toolkit comes to the rescue. This is highly documented elsewhere, so I'm going to be brief.

Take this:





And add it to this:





I first tried to use the following technique:
convert bg.jpg -gravity center world.png -composite test.png

This generated a pretty picture, exactly what I wanted. What I didn't want was the fact that the picture was a freaking 1.5 megs large, not to mention the resource usage was a little high:
real    0m7.405s
user    0m7.064s
sys     0m0.112s


Next, I tried to just use composite.
composite -gravity center world.png bg.png output.png

Same results, although the resource usage was just a tad lower. So, what was I doing wrong? I explored a little and realized I was being a bit of a muppet: I was using a png background that was 1.2 megs large (long story). I further changed the compose type to "atop," as that appeared to have the lowest resource usage. I modified things appropriately:
composite -compose atop -gravity center world.png bg.jpg output.jpg


This also yielded an acceptable resource usage.
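Since the whole point is scripting this on a server, the winning command can be wrapped in a tiny helper. A minimal sketch in Python (the function names and file names here are illustrative, not from the original post; it assumes ImageMagick's composite is on the PATH):

```python
import subprocess

def build_composite_cmd(overlay, background, output):
    # "atop" compose with center gravity -- the combination that gave
    # the lowest resource usage in the tests above.
    return ["composite", "-compose", "atop",
            "-gravity", "center", overlay, background, output]

def layer_images(overlay, background, output):
    # Raises CalledProcessError if ImageMagick reports a failure.
    subprocess.check_call(build_composite_cmd(overlay, background, output))
```

Then layering is just `layer_images("world.png", "bg.jpg", "output.jpg")`, callable from any batch job.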

The result:



A Dying Laptop

Published on Tuesday, March 20, 2007



I have the pleasure of owning an old T23 laptop. To show you how old this puppy is, the current T series is at T60, and those have been out for over a year. This laptop was made in 2001, and I picked it up somewhat discounted late in 2003. It is now March 2007, and this puppy is still rock solid.

You heard me, it is almost six years old and still working fine -- that is testimony to how well this laptop was built. There are several small cracks around the case, but nothing you would notice by just walking by. This laptop has been to more countries than many people.

I had the first problem this weekend, and it isn't even related to the laptop. The hard drive, a 30G I put in at some point, started to crap out on me. Bad sectors were everywhere, so some of the programs were slightly unhappy (e.g. I couldn't boot into X).

I'm going to buy a new laptop soon, I promise, about the time my MBA goals are reached. Until then, I'll continue to be frugal, and deal with the bad sectors. Being a good IT nerd, everything is backed up to an external hard drive (and most stuff backed up remotely).

Luckily I'm using Linux -- so I was able to run fsck/smartmontools a few times in recovery mode, make the bad blocks happy, and continue as "normal." Phew, disaster averted.

One More Point Linux

Published on Thursday, March 15, 2007

It should come as no surprise that I enjoy using Linux. For the record, the first time I booted into Linux on my own was in 1997, just before entering high school. So, while some of my tech friends played with NT, I was rumbling with the Penguin. Starting in 2000 I was using Linux as my main operating system, sometimes supplemented by OS X, and only using Windows when the gaming urge surfaced. In 2004 I mostly stopped playing games, which meant dropping Windows -- and besides for work, I haven't used it since.

For me, I'll admit, there are three things that Linux still lacks:


  • Simple video conferencing support

  • Video editing support

  • Gaming



I know that all of these are supported, but, in my opinion, not particularly well. I don't care about any of these enough to actually need Windows, but it would be nice to see them improve.

So, I'm set. I'm 100% legal (I don't steal a single piece of software), and I don't have to be too afraid of viruses. What prompted me to write this little excerpt? A recent article at the Washington Post scared the beejeepers out of me, and makes me wish even more that Vista would either cure security problems, or that everybody would move over to Linux. The article details the aftermath a virus can cause, not in damaging one's computer, but in capturing information, and the author further details his experience hunting down the data. It was one of the better articles I've read in a while. If you want a little more motivation to move to Linux (or just to tighten your machine), I suggest you take a few moments to read it as well.

The Risk in Risk Mitigation

Published on

Back in the day the barrier to entry for the Internet was quite high. The technology required a steep learning curve, the equipment was extremely expensive, and sometimes even hard to acquire. Fast forward to 2007 and things have certainly changed. If you know any tech people you can likely get free hosting for a small website, and even more demanding websites can be hosted for not much. The cost of dedicated servers has dropped even more. And the final kicker: web services. I've started to think of some web services not as services, but more like outsourced requirements.


One of the nice things about outsourcing requirements is that risk is mitigated. I'll use SmugMug as an example: in summary, they moved their storage to Amazon's S3 network, which is something I will be utilizing as well. Amazon's S3 (and other web services) continue to drive down the barrier to entry -- now you don't even need to purchase hugely expensive servers for the sole purpose of storage! And if you don't need to purchase them, you also don't need to manage them. Risk mitigated.

However, continuing the slight allusion from The Other Blog's article on mashups, I see a slight problem with the outsourcing of requirements. While the thought isn't particularly innovative, mitigating risk by outsourcing requirements creates a dependency on the third party. This very dependency adds risk for a multitude of reasons, and when your entire web application platform revolves around a third party, as is the case with mashups, you incur great risk.

But, as is evident by the fact that I've had stitches nine different times, I'm still going to do some cool mashups anyways, so stay tuned.

I Hate GeoIP Advertising

Published on




GeoIP lets you map a user's IP address to the country it came from (I hacked up my commenting system just so you can see an example, if you are a visual type of person). Advertisers have long since picked up on this as a way to make their ads "better," hopefully becoming more appealing to their clientele. Sometimes the adaptation doesn't quite work as expected. This morning I was searching for an eBook on Embedded Linux (it is really hard to find English tech books here) -- and just had to snap a few screenshots. The advertisers picked up my IP, from Taiwan, and "customized" the advertisement. Needless to say, I've seen a total of five white females here over the entire last year.
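The idea itself is simple: a GeoIP database is just a table of IP ranges mapped to country codes, and a lookup converts the dotted-quad address to an integer and finds the enclosing range. A toy sketch in Python (the two ranges below are invented for illustration; a real lookup uses a full database such as MaxMind's):

```python
import socket
import struct

def ip_to_int(ip):
    # Convert a dotted-quad string to a 32-bit big-endian integer.
    return struct.unpack("!I", socket.inet_aton(ip))[0]

# Tiny illustrative table of (start, end, country-code) ranges.
# These allocations are made up for the example, not real ones.
GEO_TABLE = [
    (ip_to_int("1.160.0.0"), ip_to_int("1.175.255.255"), "TW"),
    (ip_to_int("8.0.0.0"), ip_to_int("8.255.255.255"), "US"),
]

def country_for_ip(ip):
    # Linear scan is fine for a toy table; real databases use a
    # binary search or radix tree over millions of ranges.
    n = ip_to_int(ip)
    for start, end, country in GEO_TABLE:
        if start <= n <= end:
            return country
    return None
```

With my coffee-shop IP falling in a Taiwanese range, `country_for_ip` would hand the ad network "TW" -- and out come the "customized" ads.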

This reminded me just how much crap people know about me; since I'm in a coffeeshop, I don't really care, but still. So, why do I hate GeoIP advertising?


  • It reminds me how much crap people know about me

  • It reminds me that I should be using the Tor network

  • Instead of displaying contextual ads that I even might be a little inclined to click, it reminds me how much I hate advertising

  • Remember, I'm in a coffee shop? GeoIPized Girls = have to sit with back to wall.


Being curious, I decided to see if I could teleport the girls to other parts of the world. Enter Tor, Privoxy and FoxyProxy. I turned them all on, went back and remembered just how cool these tools really are.


Python, AST and SOAP

Published on Wednesday, March 7, 2007

For one of my projects I need to generate thumbnails for a page -- lots and lots and lots of them. Even though I can generate them via a Python script and a very light "gtk browser," I would prefer to reduce the server load. To do this I've decided to tap into the Alexa Site Thumbnail (AST) service. It allows two methods: REST and SOAP. After several hours of testing things out, I've decided to throw in the towel and settle on REST. If you can spot the error in my SOAP setup, I owe you a beer.
I'm using the ZSI module for python.

1. wsdl2py


I pull in the needed classes by using wsdl2py.
wsdl2py -b http://ast.amazonaws.com/doc/2006-05-15/AlexaSiteThumbnail.wsdl


2. Look at the code generated.


See AlexaSiteThumbnail_types.py and AlexaSiteThumbnail_client.py.

3. Write python code to access AST over SOAP.




#!/usr/bin/env python
import sys
import datetime
import hmac
import sha
import base64
from AlexaSiteThumbnail_client import *

print 'Starting...'

AWS_ACCESS_KEY_ID = 'super-duper-access-key'
AWS_SECRET_ACCESS_KEY = 'super-secret-key'

print 'Generating signature...'

def generate_timestamp(dtime):
    return dtime.strftime("%Y-%m-%dT%H:%M:%SZ")

def generate_signature(operation, timestamp, secret_access_key):
    my_sha_hmac = hmac.new(secret_access_key, operation + timestamp, sha)
    my_b64_hmac_digest = base64.encodestring(my_sha_hmac.digest()).strip()
    return my_b64_hmac_digest

timestamp_datetime = datetime.datetime.utcnow()
timestamp_list = list(timestamp_datetime.timetuple())
timestamp_list[6] = 0
timestamp_tuple = tuple(timestamp_list)
timestamp_str = generate_timestamp(timestamp_datetime)

signature = generate_signature('Thumbnail', timestamp_str, AWS_SECRET_ACCESS_KEY)

print 'Initializing Locator...'

locator = AlexaSiteThumbnailLocator()
port = locator.getAlexaSiteThumbnailPort(tracefile=sys.stdout)

print 'Requesting thumbnails...'

request = ThumbnailRequestMsg()
request.Url = "alexa.com"
request.Signature = signature
request.Timestamp = timestamp_tuple
request.AWSAccessKeyId = AWS_ACCESS_KEY_ID
request.Request = [request.new_Request()]

resp = port.Thumbnail(request)




4. Run, and see error.


ZSI.EvaluateException: Got None for nillable(False), minOccurs(1) element 
(http://ast.amazonaws.com/doc/2006-05-15/,Url), 



 xmlns:SOAP-ENC="http://schemas.xmlsoap.org/soap/encoding/" 
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/" 
xmlns:ZSI="http://www.zolera.com/schemas/ZSI/" 
xmlns:ns1="http://ast.amazonaws.com/doc/2006-05-15/" 
xmlns:xsd="http://www.w3.org/2001/XMLSchema" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">

[Element trace: /SOAP-ENV:Body/ns1:ThumbnailRequest]


5. Conclusion



I'm not entirely certain what I'm doing wrong. I've also written another version using NPBinding to connect to the WSDL file. It seems to work much better, as it fully connects and I get a 200, but it doesn't return the thumbnail location in the response, and I get a:
TypeError: Response is "text/plain", not "text/xml"


So, while I have things working fine with REST, I would like to get the SOAP calls working. One beer reward.

AWS in Python (REST)

Published on Saturday, March 3, 2007

As some of you may know, I have some projects cooked up. I don't expect to make a million bucks (wish me luck!), but a few extra bills in the pocket wouldn't hurt. Plus, I'm highly considering further education, which will set me back a few-thirty grand. That said, one of my projects will rely heavily on Amazon Web Services. Amazon has, for quite some time now, opened up their information via REST and SOAP. I've been trying (virtually the entire day) to get SOAP to work, but seem to get snagged on a few issues. Stay tuned.
However, in my quest to read every RTFM I stumbled upon a post regarding Python+REST access to Alexa Web Search. After staring at Python code all day, especially trying to grapple with why SOAP isn't working, updating the outdated REST code was a 5 minute hack. So, if you are interested in using Alexa Web Search from Python via REST, look below:


websearch.py



#!/usr/bin/python

"""
Test script to run a WebSearch query on AWS via the REST interface.  Written
 originally by Walter Korman (shaper@wgks.org), based on urlinfo.pl script from 
  AWIS-provided sample code, updated to the new API by  
Kelvin Nicholson (kelvin@kelvinism.com). Assumes Python 2.4 or greater.
"""

import base64
import datetime
import hmac
import sha
import sys
import urllib
import urllib2

AWS_ACCESS_KEY_ID = 'your-access-key'
AWS_SECRET_ACCESS_KEY = 'your-super-secret-key'

def get_websearch(searchterm):
    def generate_timestamp(dtime):
        return dtime.strftime("%Y-%m-%dT%H:%M:%SZ")
    
    def generate_signature(operation, timestamp, secret_access_key):
        my_sha_hmac = hmac.new(secret_access_key, operation + timestamp, sha)
        my_b64_hmac_digest = base64.encodestring(my_sha_hmac.digest()).strip()
        return my_b64_hmac_digest
    
    timestamp_datetime = datetime.datetime.utcnow()
    timestamp_list = list(timestamp_datetime.timetuple())
    timestamp_list[6] = 0
    timestamp_tuple = tuple(timestamp_list)
    timestamp = generate_timestamp(timestamp_datetime)
    
    signature = generate_signature('WebSearch', timestamp, AWS_SECRET_ACCESS_KEY)
    
    def generate_rest_url (access_key, secret_key, query):
        """Returns the AWS REST URL to run a web search query on the specified
        query string."""
    
        params = urllib.urlencode(
            { 'AWSAccessKeyId':access_key,
              'Timestamp':timestamp,
              'Signature':signature,
              'Action':'WebSearch',
              'ResponseGroup':'Results',
              'Query':searchterm, })
        return "http://websearch.amazonaws.com/?%s" % (params)
    
    # print "Querying '%s'..." % (query)
    url = generate_rest_url(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, searchterm)
    # print "url => %s" % (url)
    print urllib2.urlopen(url).read()



You run it like this:
>>> from websearch import get_websearch
>>> get_websearch('python')
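Both this script and the SOAP attempt sign requests the same way: an HMAC-SHA1 over the operation name concatenated with the timestamp, base64-encoded. Since the `sha` module they import is long deprecated, here is a sketch of the same computation using `hashlib` instead (the key value is a placeholder):

```python
import base64
import datetime
import hashlib
import hmac

def generate_timestamp(dtime):
    # Same ISO-8601 "Z" format string as the scripts above.
    return dtime.strftime("%Y-%m-%dT%H:%M:%SZ")

def generate_signature(operation, timestamp, secret_access_key):
    # HMAC-SHA1 of operation+timestamp, base64-encoded.
    mac = hmac.new(secret_access_key.encode("utf-8"),
                   (operation + timestamp).encode("utf-8"),
                   hashlib.sha1)
    return base64.b64encode(mac.digest()).decode("ascii")

timestamp = generate_timestamp(datetime.datetime.utcnow())
signature = generate_signature("WebSearch", timestamp, "placeholder-secret-key")
```

The resulting `Signature` and `Timestamp` drop straight into the request parameters, exactly as in `generate_rest_url` above.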