By John Whelan on Wednesday, 20 November 2013
Posted in General Issues
My host is Arvixe. I have a pretty good understanding of how cron jobs work, but for the life of me I can't get EasySocial's cron to work properly with Amazon S3 storage. Incidentally, I know S3 itself is working because I use it for JReviews.

Although the cron seems to run (it successfully deletes user cover photos and albums), it does not transfer anything to the Amazon S3 bucket I've set up for ES (users.musicmaker.me). The bucket remains empty.

I finally got the cron to trigger using "lynx -source" instead of "wget". The email the server sends after each run contains the following output:

***
Warning: SocialAmazonLibrary::putObject(): [51] in /home/musicmak/public_html/administrator/components/com_easysocial/includes/storage/adapters/amazon/lib.php on line 338

[{"status":"200","contents":"There is nothing to dispatch currently.","time":"2013-11-19 20:35:05"}]
***
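
For reference, the crontab entry I'm describing is along these lines (the domain and secret phrase are placeholders rather than my real values); leaving the output un-redirected is what makes the server email me the result of each run:

* * * * * /usr/bin/lynx -source "http://mydomain.com/index.php?option=com_easysocial&cron=true&phrase=123secret"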

- JW
Hello John,

Ah, can you try using a single-level name instead? I am just wondering why Amazon throws an SSL error when Amazon itself recommends using multi-level subdomain bucket names. Hm, strange.
Friday, 29 November 2013 10:23
Hello John,

Can you please provide me with the FTP access too?
Wednesday, 20 November 2013 12:38
I'll do a little more troubleshooting and see what I can come up with.
Thursday, 21 November 2013 13:35
No success yet, but here are the results of my troubleshooting:

1) The cron is set to execute every 60 seconds
2) Files are deleted from the site, but nothing appears on Amazon
3) The S3 bucket path seems valid (see the quick check below the list)
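
A quick way to double-check item 3 from the shell, outside of EasySocial, would be something like this (assuming s3cmd is installed and configured with the same access keys; it should list the bucket without errors):

s3cmd ls s3://users.musicmaker.me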

I've attached the FTP details. Also, you'll notice on my 'Bands' and 'Venues' menu items that I get that pesky PHP warning related to the toolbar module. Weird.
Thursday, 21 November 2013 14:24
One more thing. If you make any changes to any files on my site, can you let me know what they are? I use Git locally to commit all changes and push them to production.

- JW
Thursday, 21 November 2013 14:30
Have you had an opportunity to take a look at my site's ES S3 issue? I'm in a holding pattern with development until I can get it working.
Friday, 22 November 2013 04:37
Hello John,

I am really sorry, I missed this thread earlier. I can't seem to log in to the FTP server with the provided credentials. Can you please advise?
Friday, 22 November 2013 09:29
Try these. I just created the FTP user.
Friday, 22 November 2013 11:10
Any update? Even some suggestions for troubleshooting would be helpful.
Saturday, 23 November 2013 03:46
I'm using wget in this format:


/usr/bin/wget -O /dev/null "http://mydomain.com/index.php?option=com_easysocial&cron=true&phrase=123secret" > /dev/null


and I'm getting these Warnings:


Warning: SocialAmazonLibrary::inputFile(): Unable to open input file: /public_html/media/com_easysocial/photos/2/10/a54cb699318a40b891e6f9fb8809c876_square.jpg in /public_html/administrator/components/com_easysocial/includes/storage/adapters/amazon/lib.php on line 338

Warning: SocialAmazonLibrary::inputFile(): Unable to open input file: /public_html/media/com_easysocial/photos/2/10/b029be7de2b08d5b7e0e876a6c236169_thumbnail.jpg in /public_html/administrator/components/com_easysocial/includes/storage/adapters/amazon/lib.php on line 338

Warning: SocialAmazonLibrary::inputFile(): Unable to open input file: /public_html/media/com_easysocial/photos/2/10/3dce67be73d6e1388991b6d9b79dadfc_featured.jpg in /public_html/administrator/components/com_easysocial/includes/storage/adapters/amazon/lib.php on line 338

Warning: SocialAmazonLibrary::inputFile(): Unable to open input file: /public_html/media/com_easysocial/photos/2/10/579df16e9109d87363c195a3bf82e858_large.jpg in /public_html/administrator/components/com_easysocial/includes/storage/adapters/amazon/lib.php on line 338

[{"status":"200","contents":"There is nothing to dispatch currently.","time":"2013-11-22 23:36:09"}]



I don't know which stream items those photos belong to; they must be from the old test post by StackIdeas Support or from my test users. All of the album photos I created yesterday, before enabling S3, were transferred to the S3 bucket successfully. My issue is in the stream: the image paths for those photos were not updated to S3 and still point to local storage.
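
For what it's worth, a quick way to see whether the files from those warnings are actually still on disk is something like this (path taken from the warnings above; adjust for the real document root):

ls -l /public_html/media/com_easysocial/photos/2/10/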
Saturday, 23 November 2013 08:10
I still need help with my S3 problem as initially posted.

- JW
Tuesday, 26 November 2013 06:39
The FTP access still does not work, John.
Wednesday, 27 November 2013 19:40
Hmmm.. Try this one instead.

- JW
Thursday, 28 November 2013 05:10
Thanks John. Hm, this is strange. Can you also provide me with the Amazon S3 logins so that I can browse the buckets?
Thursday, 28 November 2013 11:21
Thanks Mark. AWS Credentials added.
Thursday, 28 November 2013 13:12
Hello John,

I have been debugging this, and the problem only happens when the bucket path contains a folder name with a "." in it. For some reason Amazon seems to reject this with error code 51. I guess it could be due to the SSL certificate, because of the multi-level subdomain name. If you use a single-level name for the bucket path, it seems to work fine. Is there any reason for using the multi-level subdomain folder?
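
If it helps to see the SSL angle outside of EasySocial, you could inspect the certificate S3 serves for the dotted bucket hostname (built here from John's bucket name; the exact regional endpoint is an assumption). The subject should come back as the wildcard *.s3.amazonaws.com, which only covers a single label:

openssl s_client -connect users.musicmaker.me.s3.amazonaws.com:443 </dev/null 2>/dev/null | openssl x509 -noout -subject

A hostname like users-musicmaker.s3.amazonaws.com matches that wildcard, while users.musicmaker.me.s3.amazonaws.com does not, which would explain the verification failure on a multi-level name.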
Friday, 29 November 2013 02:28
No reason other than that is what was suggested in Amazon's instructions. I can easily rename the bucket to "users-musicmaker" or something if that works.

If that's the solution to the problem that I've been having, then that's exactly what I'll do.

- JW
Friday, 29 November 2013 03:04
I got curious about JohnW's issue with Amazon S3 and tested it out using a bucket name with "." in between words. I created it in the root, by the way, so my bucket folder is /bucket and not /bucket/sub-bucket. My bucket name was originally

escommunity

then I created a new bucket and named it

es.community.test

and surprisingly it still worked just fine. At first, I suspected that the issue was the "." in John's bucket name, but apparently it is not.

I have not tested a sub-bucket yet, though; when I get a chance, I'll try it out.

I hope that helps.
Saturday, 30 November 2013 04:40
Hello,

Hm, does it work for you? It doesn't work for me locally either. It was complaining about some SSL certificate on AWS.
Saturday, 30 November 2013 23:53
Hi Mark,

Could it be the Storage Path (Storage Region)? If I remember right, that was the issue you fixed during the beta.

Monday, 02 December 2013 12:06
Hello,

Hm, I'm not too sure, but I do think it probably has something to do with the SSL certificates generated by Amazon AWS. Did you add a CNAME for it?
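
Just to be clear about what I mean by a CNAME, the record would look roughly like this (the names are only examples, not your actual setup); for S3 virtual hosting the custom hostname has to match the bucket name:

photos.example.com.    CNAME    photos.example.com.s3.amazonaws.com.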
Monday, 02 December 2013 17:14
No, I did not create any CNAME for Amazon S3. You can check my settings on the LAC dev site if you want.

-Jackson
Tuesday, 03 December 2013 11:55
Hm, strange. I'll test this again locally with '.' in the bucket names.
Tuesday, 03 December 2013 15:40