
 



Comparing Cloud-Based Image Services For Developers

Nerval's Lobster writes "As Web applications grow in number and capability, storing large amounts of images can quickly become a problem. If you're a Web developer and need to store your client images, do you just keep them on the same server hosting your Website? What if you have several gigabytes worth of images that need to be processed in some way? Today, many developers are looking for an easy but cost-effective solution whereby images can be stored in the cloud and even processed automatically, thus taking a huge load off one's own servers, freeing up resources to focus on building applications. With that in mind, developer and editor Jeff Cogswell looks at a couple different cloud-based services for image storage and processing. At first glance, these services seem similar—but they're actually very different. He examines Cloudinary and Blitline, and encourages developers to take a look at ImageResizer, an open-source package that does a lot of what proprietary services do (you just need to install the software on your own servers). 'If you're not a programmer but a web designer or blogger, Blitline won't be of much use for you,' he writes. 'If you are a developer, both Cloudinary and Blitline work well.' What do you think?"
This discussion has been archived. No new comments can be posted.
  • you mean there are free tools to resize images? really?

    they're also just files like any others, so why use a dedicated image-hosting solution? any reason? to get crappy re-encodes of your JPEGs delivered? are they really so much cheaper than plain S3?

  • It isn't as if adult websites haven't been using third-party hosting of images and media for ages. You can't play a video without granting permission to foocdn.com.

    The only thing that seems to have changed is the buzzwords. Cloud my ass.

    Oh, and Beta still sucks.

  • by corychristison ( 951993 ) on Monday February 10, 2014 @01:47PM (#46210935)

    As a veteran web developer, I understand the idea... but is it really necessary?

    The biggest issue I see is if the cloud service has a blip, or is simply slower than serving from your own servers.

    In the past I've set up nginx strictly for serving static content (it does that better than most) under a subdomain. That method is probably a good middle ground for serving the files. And, let's face it, storage is cheap. A couple of servers behind a load balancer would be less prone to problems than running your site on your own server(s) and then subbing out the image hosting, storage, and manipulation to some cloud service.
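A minimal sketch of that kind of static-content nginx setup (the server name, root path, and cache lifetime here are all hypothetical):

```nginx
# Hypothetical subdomain dedicated to static image assets.
server {
    listen 80;
    server_name static.example.com;

    root /var/www/static;

    location /images/ {
        # Long cache lifetime, since image files rarely change in place.
        expires 30d;
        add_header Cache-Control "public";
        # Skip logging individual image hits to cut I/O.
        access_log off;
    }
}
```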

    Unless you're dealing in resolutions higher than 20,000 px (on either axis), and they can manipulate and serve the files faster than you can, I really don't see the need.

    • by mveloso ( 325617 )

      I assume the problem is that at some point you'll be serving out gigs of images a second and storing terabytes of them. If you want to do that in a redundant, cost-effective way, these cloud services may be a solution.

      Or you could just get a few dedicated servers from a provider that offers unlimited bandwidth. It'd be cheaper, but it'll be a bit more work for your operations people, since you have to manage them in DNS and make sure they're mirrored/redundant. That shouldn't cost more than a coupl

      • by Lennie ( 16154 )

        Really? What's wrong with doing your own resizing and then letting a CDN handle the rest?

        I think there are services out there for video re-encoding and resizing; those seem to me like the kind of service much more likely to be outsourced than done yourself.

        Anyway, speed of development and outsourcing is big business.

        Lots of developers outsource sending email for example.

        It seems kind of crazy, but that's what the world is coming to.

        The reason is, if you have very few people in your start up, you can fo

      • by gl4ss ( 559668 )

        yeah but why would you pay a 3rd party service to buy space from amazon and run a resizer script when you can just buy the space from amazon?

        if you're serving out gigs of images a second, then it should be worth your time to buy the cloud time at the cheapest rate.

    • Storage is cheap, but backup service in a datacenter sure isn't. Not when you have gigs of data. Terabytes? Time to get out your checkbook, brother.

      • by mveloso ( 325617 )

        fdcservers.net?

      • Personally I build my own servers and colocate them into datacenters.

        With that said, I have both enterprise grade servers (600GB SAS drives) and non-enterprise grade (2TB SATA drives). Storing things for serving up on the SATA drives is fine in most cases. The server automatically caches the popular files in RAM, reducing bottlenecks and slowdowns. RAM is also very cheap. 64GB minimum in my machines.

        Proper servers, in proper datacenters, with load balancing, and a proper DNS setup and it starts to look like

    • Even just phone images now are 8-15 megapixels. That doesn't seem like much, but across tens or hundreds or thousands of users it adds up to a huge processing load - and at the very least I can see many sites wanting to make thumbnail versions of images to reduce network traffic.

      So to me, looking at the problem in terms of not just storage but processing is a very useful take.
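A rough back-of-the-envelope check on that processing load (every input number here is an illustrative assumption, not a measurement):

```python
# Rough estimate of daily image-processing volume for a modest site.
# All inputs below are illustrative assumptions.
uploads_per_user_per_day = 3
users = 10_000
avg_image_mb = 4.0    # roughly a 12 MP JPEG at typical phone quality
thumbnail_sizes = 3   # e.g. small, medium, and large variants per upload

images_per_day = uploads_per_user_per_day * users
inbound_gb = images_per_day * avg_image_mb / 1024
resize_jobs = images_per_day * thumbnail_sizes

print(images_per_day)          # 30000 originals per day
print(round(inbound_gb, 1))    # 117.2 GB of inbound uploads per day
print(resize_jobs)             # 90000 resize operations per day
```

Even at these modest assumptions, that's tens of thousands of resize jobs a day, which is why offloading the processing (not just the bytes) is the interesting part.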

      • I can see that, yes. But those same phones also support client side image scaling via the HTML5 Canvas tag. (See here: http://hacks.mozilla.org/2011/... [mozilla.org])

        The fact is, screen resolutions simply aren't there yet. There's no reason to upload and store the whole 15MP photo when you can only see a quarter of it (if that).

        With the canvas tag you can actually generate a thumbnail client side as well, and upload it alongside the main image, completely offloading the image processing and reducing bandwidth (inbound and outb

        • I can see that, yes. But those same phones also support client side image scaling via the HTML5 Canvas tag.

          That is pretty much useless. From the server side you want as large an image as possible. You also want to distribute as small an image as possible at any given moment to reduce network use - that means scaling the input, sometimes in multiple ways.

          • 12MP is roughly 4000x3000px. High-end mainstream screens can't even display half of that. At 100% quality JPEG you're looking at around 2.5MB. That's pretty trivial, really. If you can halve your bandwidth and completely eliminate image processing by reducing down to, say, 8MP, would you not do it?

            I wasn't saying to resize it down to 800x600 pixels, that would be crazy. 8MP ought to be enough for anyone ;-)
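For what it's worth, the dimension math behind that kind of downscale-to-a-pixel-budget is tiny; a sketch in Python (the 8MP budget is just the number from this thread, and the function name is made up):

```python
import math

def fit_to_pixel_budget(width: int, height: int, max_pixels: int) -> tuple[int, int]:
    """Scale (width, height) down so the total pixel count fits the
    budget, preserving aspect ratio. Returns the size unchanged if it
    already fits."""
    pixels = width * height
    if pixels <= max_pixels:
        return width, height
    scale = math.sqrt(max_pixels / pixels)
    return int(width * scale), int(height * scale)

# A 12 MP (4000x3000) photo reduced to an 8 MP budget:
w, h = fit_to_pixel_budget(4000, 3000, 8_000_000)
print(w, h)  # 3265 2449
```

The same computation works client side before upload (e.g. when drawing into a canvas at the reduced size) or server side when generating thumbnails.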

            • If you can halve your bandwidth and completely eliminate image processing by reducing down to, say, 8MP, would you not do it?

              Not if you might print it. But even if you did want to go down to 8MP, that still means the server needs much smaller thumbnails.

  • http://www.hanselman.com/blog/... [hanselman.com] Azure storage comes with a CDN, and you can process/transform uploads with very little boilerplate using WebJobs. See the link for a code example.
  • call me back when you get up to the several-petabytes stage
  • That sounds like a great idea, but integrating it with an existing WCMS may be tricky. Uploading and user (author) image browsing should be seamless and sufficiently flexible for the different needs of individual shops. For example, some shops may allow or want sub-folders and some may not. Coordinating author permissions itself can be a sticky issue. A webmaster doesn't want to maintain two different "login" services for authors (local WCMS authentication versus image cloud service authentication). Some kind

  • by Wierdy1024 ( 902573 ) on Monday February 10, 2014 @03:13PM (#46211521)

    If you care about security, you would never host user-provided images on your own domain.

    Browsers ignore the file extension, and in many cases ignore the MIME type, when deciding how to process a URL. A malicious user could upload a dodgy SWF file renamed to .jpg, then get a victim to load the malicious "jpg" from your domain. The SWF can now read your domain's cookies (same-origin policy) and return them to the attacker.

    That's why Google uses 'googleusercontent.com'. Most big sites do it. If you care about your users, you should do it too...
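Hosting user content on a separate domain is the real fix; if some uploads do end up served from your main host, a cheap partial mitigation is telling browsers not to second-guess content types. A minimal sketch using only the Python standard library (the handler name is made up, and this complements rather than replaces the separate-domain approach):

```python
from http.server import SimpleHTTPRequestHandler

# Headers that keep browsers from reinterpreting user uploads as
# something other than what the server declared.
SAFE_UPLOAD_HEADERS = {
    "X-Content-Type-Options": "nosniff",   # refuse MIME sniffing
    "Content-Security-Policy": "sandbox",  # no scripts/plugins if rendered
}

class UserContentHandler(SimpleHTTPRequestHandler):
    """Static file handler that attaches the safe headers to every response."""

    def end_headers(self):
        for name, value in SAFE_UPLOAD_HEADERS.items():
            self.send_header(name, value)
        super().end_headers()
```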

  • My company uses Cloudinary to host user submitted images for some of our websites. It's easy to work with and provides options to process images in many different ways if you need to crop or apply other effects. We've found them to be very reliable and their support libraries really easy to work with. I'd definitely recommend them.
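For the curious, Cloudinary's delivery URLs encode transformations directly in the URL path, so a crop or resize is just a different URL. A rough sketch of building one (the "demo" cloud and "sample" image are Cloudinary's public documentation examples; the option set here is deliberately simplified):

```python
# Sketch of a Cloudinary-style delivery URL with on-the-fly transformations.
def cloudinary_url(cloud_name: str, public_id: str, *, width: int,
                   height: int, crop: str = "fill") -> str:
    """Build a delivery URL whose path carries the resize/crop options."""
    transform = f"w_{width},h_{height},c_{crop}"
    return (f"https://res.cloudinary.com/{cloud_name}"
            f"/image/upload/{transform}/{public_id}.jpg")

url = cloudinary_url("demo", "sample", width=200, height=200)
print(url)  # https://res.cloudinary.com/demo/image/upload/w_200,h_200,c_fill/sample.jpg
```

The appeal is that no resize job runs on your servers at all: the first request for a given transformation generates and caches the variant.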
