I host a static website on S3. To push my website to Amazon, I use the s3cmd command line tool. Everything works, except for the Content-Type, which I want to be text/html; charset=utf-8.
I know I can set charset in the meta tag of HTML files, but I want to avoid it.
This is the exact command I am using:
s3cmd --add-header='Content-Encoding':'gzip'
--add-header='Content-Type':'text/html;charset=utf-8'
put index.html.gz s3://www.example.com/index.html
This is the error I get:
Error: S3 error: 403 (SignatureDoesNotMatch): The request signature we calculated does not match the signature you provided. Check your key and signature method.
If I delete the ;charset=utf-8 part from the above command, it works, but the Content-Type is set to text/html instead of text/html; charset=utf-8.
Here is a two-step process to solve your problem.
(1) Upgrade your s3cmd installation. Version 1.0.x cannot set the character set. Install from master on GitHub. Master contains fixes for this (1) bug and this (2) bug, which caused the earlier version's failure to recognize the content-type format and the "call before definition" problem.
To install s3cmd from master on OSX, do the following:
git clone https://github.com/s3tools/s3cmd.git
cd s3cmd/
sudo python setup.py install (sudo optional based on your setup)
Make sure your Python library is in your path by adding the following to .profile, .bashrc, or .zshrc (again, depending on your system).
export PATH="/Library/Frameworks/Python.framework/Versions/2.7/bin:$PATH"
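For example, assuming you use zsh (swap in .bashrc or .profile as appropriate), you could append it and reload your shell configuration like this:

echo 'export PATH="/Library/Frameworks/Python.framework/Versions/2.7/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc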
If you use homebrew, though, this may cause conflicts; in that case just symlink the executable instead:
ln -s /Library/Frameworks/Python.framework/Versions/2.7/bin/s3cmd /usr/local/bin/s3cmd
Close the terminal and reopen it.
s3cmd --version
It will still output
s3cmd version 1.5.0-alpha3
but it's the patched version.
(2) After upgrading, use:
s3cmd --acl-public --no-preserve --add-header="Content-Encoding: gzip" --add-header="Cache-Control: public, max-age=86400" --mime-type="text/html; charset=utf-8" put index.html s3://www.example.com/index.html
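To confirm the headers actually made it onto the object, you can do a quick HEAD request with curl against the bucket's website endpoint (www.example.com here is just the example hostname from the command above; S3 website endpoints are plain HTTP):

curl -I http://www.example.com/index.html
# expect to see:
# Content-Type: text/html; charset=utf-8
# Content-Encoding: gzip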
If the upload succeeds and the Content-Type is set to "text/html; charset=utf-8", but you see this warning during the process:
WARNING: Module python-magic is not available...
I prefer to live without python-magic. I found that if you don't specifically set the mime type, python-magic often guesses wrong. If you do install python-magic, be sure to set --mime-type="application/javascript" in s3cmd, or python-magic will guess "application/x-gzip" when you gzip your js locally.
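For example, for a gzipped JavaScript file (app.js.gz and the js/ prefix are just placeholder names), an explicit mime type keeps the guessing out of the picture:

s3cmd --acl-public --no-preserve --add-header="Content-Encoding: gzip" --mime-type="application/javascript" put app.js.gz s3://www.example.com/js/app.js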
Install python-magic:
sudo pip install python-magic
A recent OSX upgrade broke pip, so you may need to update pip first:
sudo easy_install -U pip
That does it. All of this applies to s3cmd sync as well, not just put. I suggest you wrap s3cmd sync in a Thor-style task so you don't forget to set the mime-type on any particular file (if you use python-magic on gzipped files).
Here is a gist of an example Thor task for deploying a static Middleman site to S3. It lets you rename files locally and sync with s3cmd instead of renaming them one by one with s3cmd put.
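If you'd rather not pull in Thor, a rough shell sketch of the same idea follows. It is not the gist's task; deploy.sh, the build/ directory, the bucket name, and the file pattern are all placeholder assumptions:

#!/bin/sh
# deploy.sh - stand-in for a Thor-style deploy task (all names are placeholders)
set -e

BUCKET="s3://www.example.com"
BUILD_DIR="build"

# Sync everything except HTML, which needs gzip plus explicit headers.
s3cmd sync --acl-public --no-preserve --exclude '*.html' "$BUILD_DIR/" "$BUCKET/"

# Gzip each HTML file locally and upload it under its original name,
# setting Content-Type explicitly so python-magic never has to guess.
for f in "$BUILD_DIR"/*.html; do
  gzip -c "$f" > "$f.gz"
  s3cmd --acl-public --no-preserve \
    --add-header="Content-Encoding: gzip" \
    --add-header="Cache-Control: public, max-age=86400" \
    --mime-type="text/html; charset=utf-8" \
    put "$f.gz" "$BUCKET/$(basename "$f")"
  rm "$f.gz"
done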