Beginning NodeJS for PHP Devs: Getting Started Locally

You’re no stranger to locally hosted PHP, especially using MAMP or XAMPP. Here is how you get started locally with Node.js.

Setting up your first Node.js app locally on OS X is so easy it makes my eyes tingle. I have always developed on a Mac, so if you are interested in Windows, it’s probably best to find someone who specializes in that. If this doesn’t concern you, read on!
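To give you a taste before the full walkthrough (this is just a sketch of my own, not the steps from the post): assuming Node.js is already installed from the nodejs.org installer or Homebrew, a hello-world server is one file and one command. The file name app.js and port 3000 are placeholder choices.

```sh
node -v                      # confirm node is on your PATH

# Write a minimal HTTP server to app.js, then run it.
cat > app.js <<'EOF'
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from Node\n');
}).listen(3000);

console.log('Server running at http://localhost:3000/');
EOF

node app.js
```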

Continue reading “Beginning NodeJS for PHP Devs: Getting Started Locally”

MAMP Pro /etc/hosts file problems – Simplified

Lately I have been using a lot of host aliases for testing sites locally with my MAMP Pro install. Adding local sites that answer to their actual site names is amazing for a development environment, especially if you find yourself testing JavaScript-based APIs that require OAuth, like the Facebook or Twitter APIs.

MAMP Pro does have a problem, however: it doesn’t always turn off its entries in the /etc/hosts file when you stop the servers, so you end up unable to reach the real sites. Fixing this means opening the hosts file manually and commenting or uncommenting the desired entries. A tedious pain.
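To make that concrete, here is roughly what those entries look like (the domain name is a placeholder, not a real client site). Commenting a line out is what sends that name back to the live site:

```
127.0.0.1    www.example-client-site.com     # active: resolves to the local MAMP install
#127.0.0.1   www.example-client-site.com     # commented out: resolves to the real site again
```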

So I decided to write a quick shell script that lets me reset the hosts file with a single terminal command. See the one-line install command and usage below.

Shell script for cleaning/resetting /etc/hosts files messed up by MAMP Pro — Gist.
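The gist has the actual script and install command. Purely as an illustration of the idea (not the gist’s contents), toggling alias lines with sed might look something like this, assuming each local-only entry ends with a hypothetical `# local-dev` marker comment:

```sh
#!/bin/sh
# Illustrative sketch only. Toggles /etc/hosts lines tagged with "# local-dev".
HOSTS=/etc/hosts

case "$1" in
  off)
    # Comment out any tagged alias lines that are currently active
    sudo sed -i '' '/# local-dev$/ s/^\([^#]\)/#\1/' "$HOSTS"
    ;;
  on)
    # Re-enable tagged alias lines that were commented out
    sudo sed -i '' '/# local-dev$/ s/^#//' "$HOSTS"
    ;;
  *)
    echo "usage: $0 on|off" >&2
    exit 1
    ;;
esac
```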

Override Robots.txt With wget

I find myself downloading lots of files from the web when converting sites into my company’s CMS. Whether they come from static sites or other CMS platforms, trying to do this manually sucks. But thanks to wget’s recursive download feature, I can rip through a site and grab all of the images I need, while even keeping the folder structure intact.
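The basic recursive pull is a one-liner. example.com is a placeholder; adjust depth and file types to your needs:

```sh
wget -r -np -p -k http://www.example.com/
#  -r   recurse into linked pages
#  -np  never ascend above the starting directory
#  -p   also fetch page requisites (images, CSS, JS)
#  -k   rewrite links so the local copy browses correctly
```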

One thing I found out was that wget respects robots.txt files, so if the site you are trying to copy has one with the right settings, wget will only get what is allowed. This can be overridden with a few tweaks. I gladly used the trick and decided to pass it along. See the instructions at the site below.
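I won’t duplicate their write-up, but the core of the tweak is wget’s `-e robots=off` switch (that flag comes straight from wget’s manual; the linked instructions may add more). Combined with the recursive options above, and a polite pause between requests:

```sh
wget -e robots=off --wait=1 -r -np -p -k http://www.example.com/
```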
Continue reading “Override Robots.txt With wget”