One of the ways I’ve been ramping up on the remote-server idea is testing against a friendly machine hosted nearby.

And something interesting happened…

Browsing the library through Plex/Web, through an iOS device, or through the Roku is actually significantly faster from the remote server.

Videos also start quicker, and playback is more responsive.

Why is this?

The setups we are testing are as follows:

Setup 1: A typical older server wired in locally

  • As mentioned before, this server is local, physically connected via Gigabit (1000 Mbps) Ethernet
  • Mini-ITX server running Ubuntu, with SATA hard drives
  • It runs Plex and does all the downloading in the house
  • Normally, Plex has access to 98% of the CPU’s 4 virtual cores

Setup 2: A new iMac connected remotely

  • The connection here has to go through the public internet, with 15/15 Mbps FiberOp on both sides
  • i7 iMac, OS X 10.8, Fusion Drive
  • This computer is rarely taxed, so Plex has access to 95% of the CPU’s 4 cores
  • Also of note, this computer is connected via Wi-Fi and is regularly asleep (using any Plex client wakes it up)

Initial Conclusions

It’s quite obvious that, as long as there is enough bandwidth between server and client for an HD stream, the performance of the server is a far greater factor in how much of a pleasure Plex is to use.
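
To put the “enough bandwidth” condition in rough numbers, here is a quick back-of-the-envelope check. The per-stream bitrates below are assumptions for illustration, not measurements from my library:

```python
# Rough sanity check: does a 15 Mbps uplink have headroom for a single HD stream?
# These bitrates are assumptions for illustration, not measurements.
STREAM_BITRATES_MBPS = {
    "720p (typical)": 4.0,
    "1080p (typical)": 8.0,
    "1080p (high bitrate)": 12.0,
}
UPLINK_MBPS = 15.0  # the 15/15 FiberOp uplink on the iMac side

for label, mbps in STREAM_BITRATES_MBPS.items():
    headroom = UPLINK_MBPS - mbps
    verdict = "fits" if headroom > 0 else "does not fit"
    print(f"{label}: {mbps} Mbps -> {headroom:.1f} Mbps of headroom ({verdict})")
```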

Plex/Web feels snappy, with thumbnails, posters, and fanart loading almost instantly (subjectively, I would say this is a 300% improvement).

With the Roku, pressing Play on a local-server file that does not need transcoding brings a small wait of a couple of seconds. I thought this was simply a cost of using the Roku this way…

However, even when pushing the limits by pressing Play on a high-bitrate 1080p file that needs transcoding, the remote iMac responds and the file starts playing while barely showing the Roku’s loading screen (it feels instant).

This requires further testing, but it seems that if you have a choice between upgrading your home server and renting a remotely hosted one, it could actually work out better all around to rent the remote server.

Interesting Note:

When a device with AirPlay is running on the same network as a Mac that has no other way to output sound, 10.8 will send audio to the AirPlay device by default.

Running the Raspberry in the other room while building a new Mac, we heard an echo of mouse-click sounds before we realized what had happened.

Bitcasa: the service that I struggle to keep supporting…

In theory, everything sounds good, and the end result is so tantalizing.

One of the usage mistakes I made early on was to use Bitcasa for a ‘temp’ folder: the ‘TV Shows – Downloaded’ folder.
The main use of this folder is to hold currently running shows that are not going to be archived, meaning it sees a lot of churn in a week as episodes are filled in, watched, and deleted.

But forces were conspiring…

Bitcasa pulled the Ubuntu client, claiming data loss when used with any other client, and all talk of a headless version has died out.
Also, Plex and Bitcasa are not super friendly towards each other, with issues scanning in new episodes (among other minor quirks).

I personally think that this issue with new episodes comes from the way Bitcasa handles ‘infinite’ drives in Linux: it basically mounts a folder ‘on top’ of the old one, leaving all the old content underneath to wait for the day you deactivate and remove the folder from Bitcasa (I did this today and had to bring the folder back up to date).

 

Some lessons to move forward with:

  1. If you plan on using the current Ubuntu client (if you can find it), plan on only using that and the read-only clients like the mobile app or website.
  2. If you really want an infinite folder, create a new, empty one and transfer data into it: do not turn an existing folder infinite; it’s just confusing and a waste of space.
  3. If you want to use Bitcasa and Plex, start with the free plan, and get seriously involved in both forums.
    There are things Bitcasa should be doing, and there might be things that Plex can do to work around what Bitcasa should be doing.

 

I’m still not giving up, of course

The idea of paying $20/month + $10/month for all this:

  • automatically handling all the downloading,
  • transcoding,
  • and serving to
    • RasPis,
    • Rokus,
    • iPhones,
    • iPads,
    • Friend’s AppleTVs (through AirPlay),
    • and every other device that Plex supports
  • never having to worry about bandwidth complaints
  • or content notices that the ISP has cooperated on
  • keeping the home FTTH connection super responsive by keeping the concurrent connections as low as possible

as well as the ability to stop maintaining an in-home server if I choose to…

is just so much better than the old way.

As I mentioned last time, one of the things to be aware of when running RasPlex for the first time is how very slow the UI is before the cache is built.

With our larger library, it took a few days to cache everything properly.

Note: Based on the speed of development, this is going to be outdated information VERY quickly.

I’ve been doing a bunch of other projects, so I now need to update, and I want to preserve the cache (of course).

So, since I have to go through the process anyway, it’s sharing time.

  1. ssh into the Raspberry (default username/password: “plexuser” / “rasplex”, no quotes)
  2. Transfer the ‘/storage/.plexht’ folder to your local computer
    (I would normally suggest using gzip, but that would tax the Pi’s tiny CPU, so I personally just transferred the whole folder with rsync; a rough sketch of that transfer follows this list. SCP and other SFTP programs are good choices too)
  3. Use the GUI Installer to update RasPlex (which is a nice choice that only means pressing a couple buttons)
  4. Boot up the updated version
  5. Transfer everything back, overwriting the updated ‘/storage/.plexht’ folder
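
For anyone who wants to script steps 2 and 5, here is a minimal sketch of the transfer. The Pi’s address below is hypothetical, the path and credentials are the ones mentioned above, and it assumes rsync is installed on both ends:

```python
import subprocess

PI = "plexuser@192.168.1.50"      # hypothetical address; use your Pi's actual IP
REMOTE_DIR = "/storage/.plexht/"  # Plex Home Theater cache and settings on the Pi
LOCAL_BACKUP = "./plexht-backup/" # where the cache lives on the local computer

def pull_cache() -> None:
    """Step 2: copy the cache from the Pi to the local computer (rsync will prompt for the password)."""
    subprocess.run(["rsync", "-a", "--progress", f"{PI}:{REMOTE_DIR}", LOCAL_BACKUP], check=True)

def push_cache() -> None:
    """Step 5: copy the cache back over the freshly updated install."""
    subprocess.run(["rsync", "-a", "--progress", LOCAL_BACKUP, f"{PI}:{REMOTE_DIR}"], check=True)

if __name__ == "__main__":
    pull_cache()
    # ...update RasPlex with the GUI installer, boot the new version, then run push_cache()
```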

And, a pleasant bonus:

This process keeps the myPlex login info, and all other Plex personalizations.

Honestly, it’s been a little while since researching and trying new HTPC software has pulled me in, and recently I’ve been shown just how wrong that was…

In previous posts, I talked about how the dream setup was to have a lightweight Plex client installed on the RasPi, so that it would just be a matter of the well-established Plex server serving content to any number of RasPis automatically.

Recent work, however, has blown this entirely out of the water!

In what seems like the blink of an eye, RasPlex has sprung up, bringing a complete port of the new Plex/HT codebase to the RasPi version of OpenELEC.

Seriously: the entire (next-generation) Plex desktop client is now on the Raspberry.

Even the site is flashy, yet looks like it works with its hands – very nice.

Initial impressions: Fantastic.

One caveat: The FAQ warning at the bottom of this page mentions having to ‘warm up’ the cache in order to have a responsive UI, but it’s not a well-known fact, and the first boot of RasPlex had me thinking that my Raspberry was slacking: 3-4 second response times to the Plex remote, taking over 7 seconds to start a video, grinding to a halt when AirPlay was tested…

With the size of our Plex library, it was still caching 30 minutes later, and it remained fairly sluggish for a while before getting up to speed (I left it on all night).

TADA! Raspberry Pi and Plex; together in the future at last!

After having used Bitcasa for the past two months, the potential is obvious.

There are still a few things to work out, however…

First and foremost: there is no ‘headless’ client.

Using the Ubuntu client on a headless server requires installing the X libraries, a window manager, and a VNC server so that it can run in its own (networked) GUI.

The client itself feels unpolished, and moving a large folder from a ‘backup’ folder to an ‘endless’ folder requires re-uploading the files.

Moving forward, thinking towards having a cloud server downloading files and serving them through Plex to all the different devices, including Raspberrys (Raspberries?), something like Bitcasa makes more and more sense: download to an infinite drive, and you can move servers at will, or scale up/down on the fly, without having to worry about shuffling terabytes of files between instances…

If you are not already aware, Bitcasa is a service that promises infinite storage for what will be $10/month.

The pricing and business model they have created is based on the idea that they can encrypt everything in such a way that if you have a copy of “ubuntu-12.10-desktop-amd64.iso”, and other users have a copy of “ubuntu-12.10-desktop-amd64.iso”, then Bitcasa only stores 1 ‘encrypted’ copy, thus saving hard drive space.

I use the term ‘encrypted’ because, when we talk about media, there is a potential issue: if you were to download “Life of Pi (DVDscr)” (the most pirated movie this week, according to the spectacular people at TorrentFreak), and the MPAA downloads the same copy, they can approach Bitcasa, and Bitcasa has the technology to see every person with this file in their infinite folder, even though the encryption means they cannot see ‘inside’ the file.
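
Bitcasa hasn’t published the details of its scheme, but the behaviour described above is what you get from content-derived storage (the idea behind convergent encryption): the identifier of a stored blob is computed from the file’s own bytes, so identical files dedupe to one copy, and anyone holding the same file can compute the same identifier. A toy sketch of just the deduplication side, with the encryption itself left out:

```python
import hashlib

def blob_id(data: bytes) -> str:
    """Content-derived identifier: identical files always map to the same blob."""
    return hashlib.sha256(data).hexdigest()

store = {}  # stands in for Bitcasa's backend (a real service would store ciphertext)

def upload(data: bytes) -> str:
    bid = blob_id(data)
    if bid not in store:  # second, third, ... uploads of the same file cost nothing extra
        store[bid] = data
    return bid

iso = b"pretend this is ubuntu-12.10-desktop-amd64.iso"
print(upload(iso) == upload(iso))  # True: one stored copy for every user who has this file
```

That last line is the MPAA scenario in miniature: possession of the file is enough to compute its identifier and ask who else has it, even without ever seeing ‘inside’ the stored copy.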

There are a few sneaky things we could potentially do to make up for this pitfall, but at least two of them mean we’d sacrifice Bitcasa the company in order to use Bitcasa the service.

As far as testing goes, I ran Bitcasa on my headless server (thanks Matt Harrison) and uploaded 200GB for testing. It took a couple of days on our 15/15 Mbps connection, and within the first 5 minutes of testing it actually worked as well as the videos made it seem: without giving it time to cache content, I was able to watch and scan through a 4GB 720p MKV with maybe 3 seconds between opening it and seeing the first frame of video.

Very interesting…

Doubly interesting if/when we get a version of Bitcasa that runs directly on the RasPi…

So, we didn’t meet the goal, and as I said before, this means I’m going to focus on something else for the time being.

.. and that’s ok.

We still believe that a cloud-based media server is a great idea, especially with the benefits to security, and especially if the solution can come in at about $20 a month.

And for the time being that’s going on the back burner.

I may personally try a few things over the coming months, posting as I go, and maybe we’ll pick this up again during hibernation season.

On the Raspberry Pi front, there are some very promising projects to focus on.

It looks like Android on the RasPi is dead in the water due to Broadcom’s pretend-open-source drivers for the GPU (they only let it accelerate video, not any other sort of GPU task), and this has shuffled some development talent on to more rewarding (and interesting) projects.

We’ve got pyplex, continued work with XBMC, crazy storage options, and even some work on creating a version of OpenELEC that runs a RasPi-optimized version of Plex/HT (Plex Home Theater).

Expect the next phase of Raspberry Pi HTPC to be evolutionary, not so much revolutionary; I’ll keep up-to-date guides for anything I’m actively working on, and keep you posted as more interesting things develop.

One interesting thing to note, since we’ve now had a Roku for a while: there have been a few times where HD content could not be transcoded fast enough, and we flipped over to the Raspberry in order to watch it in smooth full quality. The Roku seems to be artificially limited by what level of h.264 it supports, whereas the Raspberry comes in like a lightweight fighter, ready to dodge each punch and deliver more power than it looks like it’s capable of.

I said that the project would go ahead if we had at least 15 people in by the 21st.

There are only a few more hours until that deadline is up, and we’re almost there.

What we’re looking to accomplish:

  1. A server that downloads automatically, as soon as each episode is online
  2. the metadata is then automatically applied, and it becomes available to watch
  3. currently, the plan is to download everything in HD, and then shrink it if you want to watch it on a smaller device or with a weaker connection (a rough sketch of that shrinking step follows this list)
  4. we plan to support all sorts of devices: RasPi, Roku, laptops/desktops, smart TVs, Android, iPhone/iPad/iPad mini, and even a web interface for on-the-fly content grabbing
  5. everything will be documented on this blog, and in the private mailing list, so that anyone can set up a similar system
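
As a rough illustration of item 3 (the filenames are made up, and in practice Plex’s own transcoder would most likely do this work), shrinking an HD file down for a smaller device could look something like this:

```python
import subprocess

SOURCE = "episode-1080p.mkv"  # hypothetical filenames, for illustration only
TARGET = "episode-720p.mp4"

subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-vf", "scale=-2:720",          # scale down to 720p, keeping the aspect ratio
    "-c:v", "libx264", "-crf", "23", "-preset", "veryfast",
    "-c:a", "aac", "-b:a", "128k",  # re-encode audio at a modest bitrate
    TARGET,
], check=True)
```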

It’s worth noting as well that, so far, the system would let you download anything you can watch, without DRM or strange barriers thrown in your way.
Also, no ads. That means no pop-ups, pop-unders, banners, etc.

There are all sorts of really interesting things to try as we really push the limits of what’s possible…

If you want in, send at least $20 USD via PayPal to linkingparents@gmail.com.
If we do not hit 15, then all money will be refunded and I’ll move on to something else.

In case you’re curious, here’s a little background on why I’m a good choice for this project:
I’ve been in tech for quite a few years, and have been running a server at home for almost all of that time. 90% of the work of the server has always been media, with quality climbing slowly every time a new codec is released or I get my hands on more capable technology.
9-5 (or 6-9+), I’ve always worked best when directly plugged in to business owners on one side and large-scale server setups on the other. This has given me strong documentation and communication skills; I usually go overboard on documenting the technical parts so that I can translate for the non-technical people more easily.
This will be my first community-funded project, and also the first public project that has grown organically from a personal project that really begs to be done right, from the ground up.
I’m looking forward to the journey, and to watching personal media take flight into the clouds. (Don’t worry; cheesy analogies are not standard, lol)

MEGA launched today to much interest (500,000 accounts created already), and with the inspiring statement that “The Internet belongs to no man, industry, or government”.

The idea is simple to explain, complicated to pull off: let people store whatever they want, and encrypt it in such a way that no one can see the data unless that single user decides to let them.

This, of course, means that MEGA can’t see it, nor can any government agency in the world.

Interesting… You said something about video?

To keep the blog on task, the question is: Will this work for all my video?

4TB of storage for $29.99/mo is decent enough to consider, if it means never having to think about security again.
Home movies, recordings for court cases, and hypothetically: even movies that the MPAA does not want you to have.

However, as it stands, MEGA does not fit into any solution for actually watching video content.

To be fair; this is just not what it’s made for.

The way it works means that in order to view any file stored on MEGA, it must first be downloaded to your computer, then decrypted.

Very much a “file locker” where everything is locked away safe until you need it, then it’s locked again when you walk away with whatever you came for.

Of course, this means that streaming files is completely out, and it’ll take some serious work (assuming anyone is interested in doing the work) before there’s a program that can talk to MEGA, download the file, decrypt it, then show it to you.
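
Here is a minimal sketch of that model, assuming client-side keys; it is illustrative only, not MEGA’s actual scheme, and it uses the third-party cryptography package:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The key is generated and kept on the client; the server only ever sees ciphertext.
key = Fernet.generate_key()
f = Fernet(key)

plaintext = b"pretend this is a home movie"
ciphertext = f.encrypt(plaintext)  # this blob is all the server stores

# To watch anything, the client has to download the whole blob first...
video = f.decrypt(ciphertext)      # ...and only then decrypt it locally

# Decrypting a partial download fails, which is why streaming straight off the
# server is out of the question:
try:
    f.decrypt(ciphertext[: len(ciphertext) // 2])
except Exception as exc:
    print("partial decrypt failed:", type(exc).__name__)
```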

For now, everything is accessed through the browser, and while there is an API, it’s really going to come into its own for encrypting communication, not static files.

I’m going to try it out, simply because it’s worth seeing how it’s done, but to be honest, it just doesn’t fit into anything I do right now, and it won’t until it’s as seamless as Dropbox or Bitcasa, with mobile apps and everything.

 

 

Aside from all this, there is one thing I want to say about the Terms of use and Privacy policy:
There have been some people saying that it’s worrying that MEGA collects and stores IP addresses, communication logs, site usage, etc. People have also expressed concern that MEGA does so to help serve advertising.
While there have been scares before, and I understand the initial hesitation, the way that MEGA works means that theoretically they could collect every shred of data they have access to and still never have any idea what you have in your files, or even what the files are named.
In my opinion, it’s better for them to own up to being a for-profit company (which they are), while building a solution that cannot be policed.
