With the buzz going around the web about Netflix's latest original content, "House of Cards," we've been discussing how great it would be to see more new original content on Netflix. A recent quote from the CEO of Netflix sums it up.
“The goal,” says Hastings, “is to become HBO faster than HBO can become us.”
The counterargument is the sheer expense of creating such original content. Will our $8 a month be enough to create and deliver new and amazing shows? HBO already charges more than that, on top of the subsidies it gets from the cable companies. Many doubt that HBO could ever go standalone and still offer shows that are so expensive to produce.
Why not crowd fund new shows?
The success stories of independent video game developers and countless other entrepreneurs getting kickstarted on crowdfunding sites like Kickstarter or Indiegogo make me wonder if original content couldn't be funded the same way. What would it take to create some pilots and let the community put their money up to fund the rest of the series?
Better yet, in addition to having in-house writers dream up the shows, open it to public submissions. Let's give the YouTubers and Vimeo creators out there a chance to get their own series funded, filmed, and streamed to the millions of Netflix subscribers.
This is definitely something that should be on the table. We can build it. We have the technology.
I checked this book out from the library a few months before my first trip to the Caribbean. It still stands as one of my favorite books.
MAMP Pro does have a problem, however: it doesn't always turn off its entries in the /etc/hosts file when you stop the servers, so you have trouble accessing the real, live site. Fixing this requires opening the hosts file manually and commenting or uncommenting the desired entries. A tedious pain.
So I decided to write a quick shell script that lets me reset the hosts file with a quick terminal command. See the one-line install command and usage below.
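A minimal sketch of the idea, assuming MAMP Pro tags its hosts entries with an identifiable comment (the `MAMP PRO` marker, the demo file path, and the function name here are my assumptions, not the actual script):

```shell
#!/bin/sh
# reset_hosts: comment out any hosts entries carrying a marker tag.
# MARKER is an assumption -- adjust it to whatever tag your
# MAMP Pro version actually writes next to its entries.
MARKER="MAMP PRO"

reset_hosts() {
  # $1 = path to the hosts file (normally /etc/hosts, run with sudo).
  # Prefix '#' to every uncommented line containing the marker;
  # sed keeps a .bak copy of the original alongside it.
  sed -i.bak "/$MARKER/ s/^\([^#]\)/#\1/" "$1"
}

# Demo against a scratch copy instead of the real /etc/hosts:
cat > /tmp/hosts.demo <<'EOF'
127.0.0.1 localhost
127.0.0.1 mysite.dev # MAMP PRO
EOF
reset_hosts /tmp/hosts.demo
cat /tmp/hosts.demo
```

To flip the entries back on, a matching sed command can strip the leading `#` from the same marked lines.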
Fix the CodeIgniter Plupload problem of "You did not select a file to upload." by making sure that you pass the correct file array key to do_upload():
How I found it:
I've been working on my first CodeIgniter project that required me to upload some images. For that task I chose Plupload, a super slick upload manager written by the same people that brought us TinyMCE.
Decided to switch domain names to something shorter and cooler that I think is more in line with my current endeavors. So, I just migrated everything to a new WordPress install sitting on a nearly stock Bones framework. I should be doing some cool stuff with it in the future, but for now, here it is. Enjoy my few posts!
Diaspora code was opened up on GitHub today.
“Git” it here: http://github.com/diaspora/diaspora
In the process of doing the install, I had some rake errors.
rake aborted!
dlopen(/Library/Ruby/Gems/1.8/gems/ruby-debug-base-0.10.3/lib/ruby_debug.bundle, 9): no suitable image found. Did find:
/Library/Ruby/Gems/1.8/gems/ruby-debug-base-0.10.3/lib/ruby_debug.bundle: no matching architecture in universal wrapper - /Library/Ruby/Gems/1.8/gems/ruby-debug-base-0.10.3/lib/ruby_debug.bundle
The "no matching architecture" means the gem's native extension was compiled for the wrong architecture; reinstalling it recompiles the extension. Fixed by running:
sudo gem uninstall ruby-debug-base; sudo gem install ruby-debug-base linecache
A distributed, user-controlled Facebook is something I'd definitely like, especially for my family and friends. This looks promising. Maybe I'll get some Ruby in me and dig around in it.
Just passing it along. We’ll definitely see where this road leads.
I find myself downloading lots of files from the web when converting sites into my company's CMS. Whether from static sites or other CMS platforms, trying to do this manually sucks. But thanks to wget's recursive download feature, I can rip through a site and grab all of the images I need, even keeping the folder structure intact.
One thing I found out is that wget respects robots.txt files, so if the site you are trying to copy has one with restrictive settings, wget will only get what is allowed. This can be overridden with a few tweaks. I gladly used it and decided to pass it along. See the instructions at the site below.
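The tweak in question is wget's `robots=off` switch. A sketch of the kind of command I mean — the domain, output directory, and accepted extensions are placeholders, not from the original post:

```shell
# Recursively pull a site's images while preserving folder structure.
#   -r             recurse into linked pages
#   -np            never ascend to the parent directory
#   -nH            drop the hostname folder from saved paths
#   -e robots=off  ignore robots.txt (the override)
#   -A ...         keep only files with these extensions
#   -P ...         save everything under ./site-assets
wget -r -np -nH \
     -e robots=off \
     -A jpg,jpeg,png,gif \
     -P ./site-assets \
     http://example.com/
```

Worth using responsibly: robots.txt is usually there for a reason, but for a site you own and are migrating, ignoring it is fair game.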
I just got invited to Forrst and realized that I spend too much time consuming web content.
Not the typical Myspace, Facebook updates, and YouTube crap, but lots of web "stuff": languages, tutorials, browsing stock resource sites, reading Twitter posts. I spend way more time consuming content than I do creating content or participating.
So, while venturing into ColdFusion Builder I managed to delete the entire contents of my htdocs directory. 11 gigs… gone… poof. My first thought was, "I can't believe that just happened!" Then, "Damn, with that new 7200 RPM hard drive, they disappeared 50% faster." Not to worry, I've got a Time Machine backup at home. I'll just restore it from last night and we'll be all good.
It wasn’t that simple.