I am working with the Ionic framework, currently designing a posts page with text and images. Users can post their data and images, and all of it must be kept secure.
So I Base64-encode each image and save it in the database:
encodeURIComponent($scope.image)
Each time a user makes a request, I select the rows from the table, decode the images, and display them along with the text:
decodeURIComponent($scope.image)
with HTML "data:image/jpeg;base64,_______"
conversion.
This works, but it takes much more time than I expected. On top of that, Base64 makes the images about 33% bigger, and the whole thing feels bloated.
I then decided to move to Cordova's file upload plugin, but I realized that maintaining files that way is risky and complicated. I also tried to save the binary data directly in the database, but failed.
Selecting the text without the Base64 data dramatically reduces the time. Would it work to select the other columns first, display them, and then fetch each image individually in a separate HTTP call? Is that the right mechanism for handling secure images?
As a rule of thumb, don't save files in the database.
What does the MySQL manual have to say about it? http://dev.mysql.com/doc/refman/5.7/en/miscellaneous-optimization-tips.html
Don't save Base64-encoded files in a database at all
As you discovered, there is unwanted overhead in encoding and decoding, plus extra space used up, which means extra data transferred back and forth as well.
As @mike-m has mentioned, Base64 encoding is not a compression method. Why Base64 encoding is used at all is answered by a link that @mike-m posted: What is base 64 encoding used for?
In short, there is nothing to gain and much to lose by Base64-encoding images before storing them on the file system, be it S3 or otherwise.
What about gzip or other forms of compression that don't involve Base64? Again, the answer is that there is nothing to gain and much to lose. For example, I just gzipped a 1,941,980-byte JPEG image and saved 4,000 bytes; that's a 0.2% saving.
The reason is that images are already in compressed formats. They cannot be compressed any further.
When you store images without extra compression, they can be delivered directly to browsers and other clients, and they can be cached. If they are compressed (or Base64-encoded), they need to be decompressed by your app first.
Modern browsers are able to display Base64 images embedded in the HTML, but then they cannot be cached, and the data is about 30% larger than it needs to be.
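You can see that overhead with a few lines of PHP (the 300,000-byte blob just stands in for a JPEG):

    <?php
    // Base64 turns every 3 bytes into 4 characters: a fixed ~33% overhead.
    $raw     = random_bytes(300000);
    $encoded = base64_encode($raw);

    printf("raw:    %d bytes\n", strlen($raw));       // 300000
    printf("base64: %d bytes\n", strlen($encoded));   // 400000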
Is this an exception to the norm?
I presume you mean that a user can download images that belong to him or are shared with him. This can easily be achieved by saving the files outside the web root in the file system and storing only the path in the database. The file is then sent to the client (after doing the required checks) with fpassthru.
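A minimal sketch of that approach, assuming an images table with path, mime, and owner_id columns (the names are hypothetical):

    <?php
    // serve_image.php - stream a private image after an ownership check.
    session_start();

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $stmt = $pdo->prepare(
        'SELECT path, mime FROM images WHERE id = ? AND owner_id = ?'
    );
    $stmt->execute([(int) $_GET['id'], $_SESSION['user_id']]);
    $image = $stmt->fetch(PDO::FETCH_ASSOC);

    if (!$image) {
        http_response_code(404);   // not found, or not yours
        exit;
    }

    header('Content-Type: ' . $image['mime']);
    header('Content-Length: ' . filesize($image['path']));
    $fp = fopen($image['path'], 'rb');
    fpassthru($fp);
    fclose($fp);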
What about when I grow to 100,000 users?
Use a CDN, or use a file system that's specially suited for this, like Btrfs.
Do I still need the database then?
Yes indeed. Use it to the fullest by saving all the information about the file, including its path, in the database, then save the file itself in the file system. You get the best of both worlds.
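For example, the upload side could look like this (the table layout and base path are assumptions):

    <?php
    // Save the file itself on disk; save only its metadata in the database.
    session_start();

    $dir  = '/var/app-files/images';              // outside the web root
    $name = bin2hex(random_bytes(16)) . '.jpg';   // never reuse the client's name
    $path = "$dir/$name";
    move_uploaded_file($_FILES['image']['tmp_name'], $path);

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $stmt = $pdo->prepare(
        'INSERT INTO images (owner_id, path, mime, size) VALUES (?, ?, ?, ?)'
    );
    $stmt->execute([
        $_SESSION['user_id'],
        $path,
        mime_content_type($path),
        filesize($path),
    ]);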
I would suggest you continue with the Base64 string; you can use the LZ-string compression technique to reduce the string size. I've been using it and it's working pretty well.
I don't know how close this is to your question, but I hope it helps you out. Here is the LZ compression library: https://github.com/pieroxy/lz-string/
Since these are just personal files, you could store them in S3.
To be safe about file uploads, just check the file's MIME type before handing it to whatever storage you choose:
http://php.net/manual/en/function.mime-content-type.php
just run a quick check on the uploaded file; a minimal sketch (the allowed list is only an example):
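    <?php
    // Reject anything that doesn't look like an image before storing it.
    $allowed = ['image/jpeg', 'image/png', 'image/gif'];
    $mime = mime_content_type($_FILES['image']['tmp_name']);

    if (!in_array($mime, $allowed, true)) {
        http_response_code(400);
        exit('Invalid file type');
    }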
no big deal!
Keeping files in the database is bad practice; it should be your last resort. S3 is great for many use cases, but it's expensive at high usage, and local files should be used only for intranets and apps that aren't publicly available.
In my opinion, go S3.
Amazon's SDK is easy to use, and you get 1GB of free storage for testing. You could also use your own server; just keep files out of your database.
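With the AWS SDK for PHP, an upload is roughly this (bucket, region, and key are placeholders):

    <?php
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',           // placeholder region
    ]);

    $s3->putObject([
        'Bucket'     => 'my-app-images',    // placeholder bucket
        'Key'        => 'users/42/img1.jpeg',
        'SourceFile' => $_FILES['image']['tmp_name'],
        'ACL'        => 'private',          // keep personal files private
    ]);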
Solution for storing images on the filesystem
Let's say you have 100,000 users and each of them has 10 pics. How do you handle storing them locally? Problem: a directory with tens of thousands of entries gets painfully slow on most Linux filesystems, so your file structure should avoid ever getting there.
Solution: make the folder path floor(userID / 1000) * 1000, followed by the userID itself (see the sketch below).
That way the images of the user with id 989787 will be stored in:
989000/989787/img1.jpeg
989000/989787/img2.jpeg
989000/989787/img3.jpeg
And there you have it: a way of storing images for a million users that doesn't choke the Unix filesystem.
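In PHP that folder scheme might look like this (the base path is an assumption):

    <?php
    // Map a user id to a bucketed folder so no directory grows too large.
    function userImageDir(int $userId): string
    {
        $bucket = intdiv($userId, 1000) * 1000;   // e.g. 989787 -> 989000
        $dir = "/var/app-files/images/$bucket/$userId";
        if (!is_dir($dir)) {
            mkdir($dir, 0750, true);              // create parents as needed
        }
        return $dir;
    }

    // userImageDir(989787) => /var/app-files/images/989000/989787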
How about storage sizes?
Last month I had to compress 1.3 million JPEGs for the e-commerce site I work on. When uploading images, compress them using Imagick with lossless flags and 80% quality. That removes the invisible pixels and optimizes your storage. Since our images vary from 40x40 (thumbnails) to 1500x1500 (zoom images), we have an average of 700x700 images; 1.3 million of them filled around 120GB of storage.
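With PHP's Imagick, that recompression step might look like the sketch below; I'm reading "lossless flags" as stripping metadata before re-encoding at 80% quality (the path is a placeholder):

    <?php
    // Strip metadata and re-encode a JPEG at 80% quality.
    $img = new Imagick('/var/app-files/images/989000/989787/img1.jpeg');
    $img->setImageCompression(Imagick::COMPRESSION_JPEG);
    $img->setImageCompressionQuality(80);
    $img->stripImage();     // drop EXIF and other invisible baggage
    $img->writeImage();     // overwrite the original file
    $img->clear();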
So yeah, it's possible to store it all on your filesystem.
When things start to get slow, you hire a CDN.
How will that work?
The CDN sits in front of your image server; whenever the CDN is asked for a file and doesn't find it in its storage (a cache miss), it copies it from your image server. Later, when the CDN is asked for it again, it delivers the image from its own cache.
This way no code is needed to migrate to CDN image delivery; all you need to do is hire a CDN and change the URLs in your site. The same works for an S3 bucket.
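One way to keep that migration to a configuration change is to build every image URL through a single helper (the host names are placeholders):

    <?php
    // All image URLs go through one function, so pointing the site at a
    // CDN later is a config change rather than a code migration.
    function imageUrl(string $path): string
    {
        $base = getenv('IMAGE_BASE_URL') ?: 'https://img.example.com';
        return rtrim($base, '/') . '/' . ltrim($path, '/');
    }

    // Before the CDN: IMAGE_BASE_URL=https://img.example.com
    // After the CDN:  IMAGE_BASE_URL=https://cdn.example.com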
It's not a cheap service, but it's waaaay cheaper than CloudFront, and when you get to the point of needing it, you can probably afford it.