AWS API Keys in OSX Keychain

AWS API keys are powerful things that you don’t want to leave lying around. Amazon’s suggestion is to keep them in ~/.aws/config; I’m not a fan of that. OSX has Keychain, a secure repository for credentials that most OSX apps use for caching your logins to various websites. This might not be the ideal solution, but it’s better than an unencrypted file in your home directory.

I’ve built a set of three scripts that use OSX Keychain to store your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, and retrieve them into environment variables when needed by the AWS API or any script that honors those environment variables.
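The core of the trick is the OSX security command, which can store and fetch generic passwords from Keychain. A minimal sketch of the idea (the service names here are my own, not necessarily what the scripts use):

```shell
# Store the keys once (values here are placeholders):
security add-generic-password -a "$USER" -s aws-access-key-id -w "AKIAEXAMPLE"
security add-generic-password -a "$USER" -s aws-secret-access-key -w "example-secret"

# Retrieve them into environment variables when needed:
export AWS_ACCESS_KEY_ID=$(security find-generic-password -s aws-access-key-id -w)
export AWS_SECRET_ACCESS_KEY=$(security find-generic-password -s aws-secret-access-key -w)
```

The -w flag on find-generic-password prints just the password, which makes it easy to capture in a variable.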

AWS New Account Config

We’re getting ready to deploy our first production workload in AWS, and our AWS account team recommended we enable a bunch of auditing on our accounts in each region. That is a lot of clicking for nine regions across three accounts.

This script will configure AWS CloudTrail and AWS Config Service in all regions, configure the logging bucket, and establish a reasonable password policy. Amazon is about to release three (or four) more regions in Ohio, England, Korea and India. As these regions spin up you’ll need to enable auditing trails there, even if you never plan to use the region.
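For flavor, here is a minimal sketch of the per-region CloudTrail piece using the AWS CLI; the trail name and bucket are illustrative, and the real script does considerably more (Config Service, bucket policy, password policy, etc.):

```shell
# Enable a CloudTrail trail in every region (names here are illustrative).
BUCKET="logs-mydomain"
for region in $(aws ec2 describe-regions --query 'Regions[].RegionName' --output text); do
    # Create a trail in this region pointing at the shared logging bucket...
    aws cloudtrail create-trail --name Default \
        --s3-bucket-name "$BUCKET" --region "$region"
    # ...and actually turn logging on (create-trail alone does not start it).
    aws cloudtrail start-logging --name Default --region "$region"
done
```

Driving the loop off describe-regions means new regions get picked up automatically the next time you run it.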

The script can also be used with --status to make sure all your logging is enabled.

./ --status mydomain 49nnnnnnnnn
Using mydomain as my bucket suffix and 49nnnnnnnnn as my AWS Account Number
						Cloud Trail Status
Region 		Trail Name 		Bucket 		GlobalEvents?		Logging On?
eu-west-1 	Default 	logs-mydomain 		False 		True
ap-southeast-1 	Default 	logs-mydomain 		False 		True
ap-southeast-2 	Default 	logs-mydomain 		False 		True
eu-central-1 	Default 	logs-mydomain 		False 		True
ap-northeast-1 	Default 	logs-mydomain 		False 		True
us-east-1 	Default 	logs-mydomain 		True 		True
sa-east-1 	Default 	logs-mydomain 		False 		True
us-west-1 	Default 	logs-mydomain 		False 		True
us-west-2 	Default 	logs-mydomain 		False 		True

					AWS Config Service Status
Region 		Recorder Name 		Bucket 			Last Status?		Recording?
eu-west-1 	Default-eu-west-1 	logs-mydomain 		SUCCESS 		True
ap-southeast-1 	Default-ap-southeast-1 	logs-mydomain 		SUCCESS 		True
ap-southeast-2 	Default-ap-southeast-2 	logs-mydomain 		SUCCESS 		True
eu-central-1 	Default-eu-central-1 	logs-mydomain 		SUCCESS 		True
ap-northeast-1 	Default-ap-northeast-1 	logs-mydomain 		SUCCESS 		True
us-east-1 	Default-us-east-1 	logs-mydomain 		SUCCESS 		True
sa-east-1 	Default-sa-east-1 	logs-mydomain 		SUCCESS 		True
us-west-1 	Default-us-west-1 	logs-mydomain 		SUCCESS 		True
us-west-2 	Default-us-west-2 	logs-mydomain 		SUCCESS 		True

|        GetAccountPasswordPolicy         |
||            PasswordPolicy             ||
||  AllowUsersToChangePassword  |  True  ||
||  ExpirePasswords             |  True  ||
||  HardExpiry                  |  False ||
||  MaxPasswordAge              |  180   ||
||  MinimumPasswordLength       |  8     ||
||  RequireLowercaseCharacters  |  True  ||
||  RequireNumbers              |  True  ||
||  RequireSymbols              |  True  ||
||  RequireUppercaseCharacters  |  True  ||
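That policy can also be established non-interactively with the AWS CLI; a sketch matching the values in the table above:

```shell
# Set a password policy matching the table above:
aws iam update-account-password-policy \
    --minimum-password-length 8 \
    --require-symbols --require-numbers \
    --require-uppercase-characters --require-lowercase-characters \
    --allow-users-to-change-password \
    --max-password-age 180

# Verify; --output table gives the layout shown above:
aws iam get-account-password-policy --output table
```

Setting --max-password-age is what flips ExpirePasswords to True; HardExpiry stays False unless you pass --hard-expiry.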

This is a work in progress and as I delve deeper into account best practices I will be adding to this.

Chef on a Raspberry Pi

So Opscode (now GetChef) doesn’t have an omnibus installer for the ARM/Raspberry Pi, but it’s pretty easy to get it set up.

Start with the base default Raspbian (wheezy) image and get it to where you can ssh in.
Run the following on the Pi:

sudo apt-get install rubygems bundler
sudo gem install chef --verbose

Then run this on your workstation to bootstrap:

knife bootstrap -N NODENAME -x pi -P raspberry --sudo IP_of_rPi

You need to set the node name explicitly, as the hostname under Raspbian is “raspberrypi” and you probably want something else. The -x and -P options are the default Raspbian credentials.

You will want recipes to fix the following in your cookbooks:

  • Remove the default “pi” user
  • Add your own user account
  • Set the hostname to the node name

Lastly, the following if condition will detect whether you’re on a Pi so a recipe can behave differently:

if node['platform'] != "raspbian"

Quick Hack to allow any folder to be a TimeMachine Destination

Apple doesn’t let you use a shared folder as a Time Machine destination in regular OSX (you can buy Server and get that functionality); only a full volume or a Time Capsule will do.

But you can use this command-line trick. It assumes the folder you exported is called TimeMachine.
1) AFP-mount your TimeMachine share via Command-K on the Mac you want to back up.
2) Try this. It will probably fail with an error about locking:
sudo tmutil setdestination /Volumes/TimeMachine

3) On the server you want to back up to, run:
defaults write \
/private/var/db/dslocal/nodes/Default/sharepoints/TimeMachine.plist \
"timeMachineBackup" '(1)'

4) Rerun
sudo tmutil setdestination /Volumes/TimeMachine

Validate you can start your backup with:
tmutil startbackup

Recovering deleted files on an iPad

So I come home yesterday to a very, very dejected child. Apparently she somehow deleted all the slides for a school project due when she gets back from vacation. First off – iOS doesn’t have .Trash. Second, she hadn’t synced to iTunes in several months. Third, iCloud backup wasn’t turned on.

Not expecting too much, I took her iPad and scoured the Internet to see what I could find that would act as an undelete tool for iOS.

I tried a few things I found on the internet, but they either didn’t work, or didn’t work right in a Fusion VM. I hate downloading things from questionable sources, and may very well have some trojan running on one or more of my systems right now. Ugh.

What these failed tools did tell me is that when an iOS device is in DFU mode, you can apparently do things. I’ve used DFU before when restoring a jailbroken iPhone to normal, so I had an idea what was going on here. Given that I’d last used DFU in the jailbreak world, I turned my Google queries to that topic to see if any of the jailbreakers had found a way to mount or access an iPad filesystem via DFU mode.

What I found was a very slick tool: Automatic SSH Ramdisk. This little Java app will detect a USB-connected device in DFU mode and cause it to boot a rescue image with SSH enabled. You then ssh to localhost:2022 and you’re talking to the iPad via the USB connection.

With this, I was able to scp -r the entire contents of her iPad back to my Mac. I was also able to make a copy of the iPad’s “harddrive” for even more analysis.
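In case it helps anyone, the copy step was roughly this (the port comes from the ramdisk tool; the source path is the top-level directory I describe below, so adjust for your device):

```shell
# The rescue ramdisk forwards the device's SSH over USB to localhost:2022.
mkdir -p ~/ipad-rescue
# Recursively pull the app data directory off the device:
scp -P 2022 -r root@localhost:/var/media ~/ipad-rescue/
```

Note that scp uses a capital -P for the port, unlike ssh.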

Looking through the contents of her iPad’s filesystem it became clear where I was going to find the files if they were still there. Each application lives in /var/media/Applications/SOME-LONG-STRING-OF-HEX. Inside that directory is the directory with the app contents, a plist with the apple-id used to buy the app along with versioning info, the icon, and whatever private local data the app creates. In this case, I was able to find copies of her deleted files in the SOME-LONG-STRING-OF-HEX/Cache directory and extract those out.

My kid, however, gave me the impression there were still other files that were lost (not true: I had recovered everything she’d done up to that point), so I decided it would be worth doing some image forensics. I figured that finding deleted JPGs on an iOS HFS image was probably similar to what the FBI does on a regular basis to bust child pornographers, so there should be some good Linux tools to scan a disk image for the markers of graphics files.

I found two that looked promising: Foremost and Scalpel. I had a bit of a challenge getting the disk image off the iPad. “dd” wasn’t available on the rescue image, but I was able to ssh into my iPhone and copy it from there onto the rescue image. Amazingly, that worked (given the different iOS versions and chipsets). I also had a hell of a time actually opening any of the /dev/disk files. Eventually I hit upon ssh -p 2022 root@localhost "cat /dev/rdisk0" | dd of=ipad.img as the workable method. An hour or so later, I had a 16G image file on my Mac.

Next step was to get that over to an Ubuntu box, and apt-get install foremost. man foremost for instructions, but I found that dd if=rdisk0.img | foremost -Tvd -o recovered_stuff worked best for me. That recovered about 2800 or so files. Most of them were png files consisting of icons for applications. Foremost never found any of the images in her Photos, or any deleted Photos. Scalpel was based on foremost, so I tried it next. That required a compile and editing the config file to enable looking for png & jpg headers. The command here was scalpel -i file_with_name_of_image -o recover-target -c ./scalpel.conf, but I suggest reading the man page too. Scalpel didn’t find any more files to recover than foremost did.

I wasn’t expecting much from the image-level scanning. I’m not 100% sure of the nature of the iDevice storage, but given that it’s flash memory, it probably has the same wear-leveling/trimming that occurs with PC SSDs, and the flash will begin to zero out blocks as soon as files are deleted so they’re ready to accept new data. (Updated to add this link I had laying around: SSD firmware destroys digital evidence, researchers find | Flash Memory | Macworld.)

The moral of the story here is (as always): MAKE BACKUPS! However, if you didn’t take sysadmin 101, there is still a chance your files (or older versions) are lurking around inside your iOS device and could be recovered.

What I determined is that I need to build a throw-away windows VM that I can snapshot and revert as I try these random things I download off the internet, and that I also need a Linux forensic VM laying around with enough memory and storage to analyze these things.

I did find one useful tool for getting easy access to the iPad’s filesystem: iExplorer is a Windows or OSX tool for browsing files on the device. You can get direct access to the media files, and you can browse the contents of all your Apps. You can even FUSE mount the filesystem and browse it via a shell.

Number of the Week: PCs Make Americans $500 Billion Richer – Real Time Economics – WSJ

The Federal Reserve Bank of Atlanta did an analysis of how much annual benefit the average American gets from their PC: $1,700.

$1,700: The annual benefit the average American derives from personal computers

Despite all the wrenching change the computer age has brought, humanity is probably better off than it would have been if the PC had never been invented. Now, economists at the Federal Reserve Bank of Atlanta have taken a stab at figuring out exactly how much better off we are.

The economists, Karen Kopecky and Jeremy Greenwood, traced the history of the computer market back to the introduction of the Apple II in 1977 to calculate how much value, or “utility”, American consumers derive from a given amount of computing power. They then looked at how much we actually paid for that computing power, in the form of desktop PCs, laptops, notebooks, software and so on. The difference, known as the “welfare gain”, is the benefit we get from personal computers above and beyond what we pay for them.

Back in the days of magnetic-tape memory, the annual benefit was pretty small — somewhere between zero and about $6 for the average American, adjusted for inflation, depending on the method of calculation. But by 2009, the price of computing power had fallen more than 99.8% and personal computers had become a lot better and more widely used. As a result, the welfare gain rose to somewhere between $1,300 and $2,100 per person, the economists’ estimates suggest. Ballpark average: $1,700.

That’s a massive benefit, adding up to about $500 billion, or 5% of total consumer spending in 2009.
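The arithmetic checks out as a back-of-envelope calculation: $1,700 a head across roughly 300 million Americans lands right around that $500 billion figure.

```shell
# Back-of-envelope: $1,700 per person times ~300 million Americans.
per_person=1700
population=300000000
total=$((per_person * population))
echo "total: $total"    # prints total: 510000000000, i.e. about $500 billion
```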

To be sure, the economists’ estimates are based on some assumptions that, while common in the world of economics, are open to debate. For one, they assume that people are extremely rational, and always buy exactly the number of personal computers that maximizes their utility. To the extent that irrational impulses drive people to buy computers, or to the extent that the use of computers entails costs people don’t recognize (say, attention-span deficits or Internet addiction), the actual benefit could be significantly smaller.

That said, those who want to test the estimates can pose themselves a question: How much money would somebody have to give you to take away all your personal-computing gadgets permanently? If it’s a lot more than you paid, Ms. Kopecky and Mr. Greenwood are probably not too far from right.

WordPress update monitoring

If you’re reading this, you know I rarely visit my blogs. That presents a problem, as I never get the nag from WordPress that my version is so out of date, my site has been taken over by Russian Yakuza using it to spy against the Chinese on behalf of Syria or something. Below is a simple little script that can be thrown in cron and will bug you when WordPress releases a new version and you’ve not updated.

DIRS="Insert list of directories with wordpress here"
CHECK_URL=""   # set this to the version-check URL discussed below

for dir in $DIRS ; do
    current_file="$dir/wp-includes/version.php"
    if [ -f $current_file ] ; then
        current=`grep ^\\$wp_version $current_file | awk '{print $NF}' | sed s/\;//g | sed s/\'//g`
        survey_says=`wget -O - -o /dev/null ${CHECK_URL}${current}`
        if [ "$survey_says" != "latest" ] ; then
            echo "$dir Needs an upgrade!!!!"
            echo "Currently $current"
        fi
    else
        echo "$current_file does not exist!"
    fi
done

The key here is the URL “”. Append a version number to the end of that, and it will tell you if you’re at the latest or need to upgrade.
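To illustrate how the pipeline pulls the version out, here it is against a sample wp-includes/version.php (the 4.4.1 and the temp path are made up for illustration):

```shell
# Fake version.php matching WordPress's layout (version number is made up):
cat > /tmp/version.php <<'EOF'
<?php
$wp_version = '4.4.1';
EOF

# Same extraction idea the script uses: grab the last field, strip ; and '
current=$(grep '^\$wp_version' /tmp/version.php | awk '{print $NF}' | sed "s/;//g; s/'//g")
echo "$current"    # prints 4.4.1
```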

How to configure Netatalk on Ubuntu to be TimeMachine Server

Seems pretty straightforward:
sudo apt-get update
sudo apt-get install netatalk
sudo apt-get install avahi-daemon

Add the following line to /etc/netatalk/AppleVolumes.default:
/huge/TimeMachine "TimeMachine" options:tm

Create /etc/avahi/services/afpd.service with contents:
<?xml version="1.0" standalone='no'?>
<!DOCTYPE service-group SYSTEM "avahi-service.dtd">
<service-group>
  <name replace-wildcards="yes">%h</name>
  <service>
    <type>_afpovertcp._tcp</type>
    <port>548</port>
  </service>
</service-group>

Restart everything:
sudo service netatalk restart && sudo service avahi-daemon restart
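If the Mac doesn’t see the share, you can confirm the advertisement from the server side (avahi-browse lives in the avahi-utils package; -r resolves each service and -t exits after the initial dump):

```shell
# Install the browsing tool and check that the AFP service is advertised:
sudo apt-get install avahi-utils
avahi-browse -rt _afpovertcp._tcp
```

You should see your server’s hostname listed with port 548; if not, recheck the afpd.service file and restart avahi-daemon.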