How to update metadata of an existing object in AWS S3 using python boto3?

The boto3 documentation does not clearly explain how to update the user-defined metadata of an already existing S3 object.

Tags: python, amazon-web-services, amazon-s3, boto3

asked Sep 20 '16 at 14:36 by arc000

3 Answers

It can be done with the copy_from() method: copy the object onto itself and pass MetadataDirective='REPLACE' so that the metadata sent on the request replaces what is stored.

    import boto3

    s3 = boto3.resource('s3')
    s3_object = s3.Object('bucket-name', 'key')
    s3_object.metadata.update({'id': 'value'})
    s3_object.copy_from(CopySource={'Bucket': 'bucket-name', 'Key': 'key'},
                        Metadata=s3_object.metadata,
                        MetadataDirective='REPLACE')
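Because MetadataDirective='REPLACE' overwrites the object's user metadata wholesale, the update() on the existing metadata dict is what preserves the old keys. A minimal, boto3-free sketch of that merge step (function name and dict contents are illustrative):

```python
def merged_metadata(existing, updates):
    """Return the full metadata map to send with MetadataDirective='REPLACE'.

    S3 replaces user metadata wholesale on copy, so the request must carry
    the union of the old and new keys, with new values winning on conflict.
    """
    merged = dict(existing)  # copy, so the caller's dict is left untouched
    merged.update(updates)
    return merged

# Example: keep 'owner', add 'id', overwrite 'stage'
old = {"owner": "alice", "stage": "raw"}
new = {"id": "42", "stage": "processed"}
print(merged_metadata(old, new))  # {'owner': 'alice', 'stage': 'processed', 'id': '42'}
```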

answered Sep 20 '16 at 14:36 by arc000 (edited Sep 21 '16 at 11:50)

You can update metadata either by adding a new key or by changing the value of an existing one; here is the piece of code I am using:

    from boto3 import client

    # Placeholders: fill in your own values
    param_1 = YOUR_ACCESS_KEY
    param_2 = YOUR_SECRET_KEY
    param_3 = YOUR_END_POINT
    param_4 = YOUR_BUCKET

    # Create the S3 client
    s3ressource = client(
        service_name='s3',
        endpoint_url=param_3,
        aws_access_key_id=param_1,
        aws_secret_access_key=param_2,
        use_ssl=True,
    )

    # Build the list of image objects in a bucket; delete everything else
    def BuildObjectListPerBucket(variablebucket):
        global listofObjectstobeanalyzed
        listofObjectstobeanalyzed = []
        extensions = ['.jpg', '.png']
        for key in s3ressource.list_objects(Bucket=variablebucket)["Contents"]:
            onemoreObject = key['Key']
            if onemoreObject.endswith(tuple(extensions)):
                listofObjectstobeanalyzed.append(onemoreObject)
            else:
                s3ressource.delete_object(Bucket=variablebucket, Key=onemoreObject)
        return listofObjectstobeanalyzed

    # For a given existing object, create metadata (re-upload with Metadata)
    def createmetadata(bucketname, objectname):
        s3ressource.upload_file(objectname, bucketname, objectname,
                                ExtraArgs={"Metadata": {"metadata1": "ImageName",
                                                        "metadata2": "ImagePROPERTIES",
                                                        "metadata3": "ImageCREATIONDATE"}})

    # For a given existing object, add a new metadata key
    def ADDmetadata(bucketname, objectname):
        k = s3ressource.head_object(Bucket=bucketname, Key=objectname)
        m = k["Metadata"]
        m["new_metadata"] = "ImageNEWMETADATA"
        s3ressource.copy_object(Bucket=bucketname, Key=objectname,
                                CopySource=bucketname + '/' + objectname,
                                Metadata=m, MetadataDirective='REPLACE')

    # For a given existing object, update a metadata key with a new value
    def CHANGEmetadata(bucketname, objectname):
        k = s3ressource.head_object(Bucket=bucketname, Key=objectname)
        m = k["Metadata"]
        m.update({'watson_visual_rec_dic': 'ImageCREATIONDATEEEEEEEEEEEEEEEEEEEEEEEEEE'})
        s3ressource.copy_object(Bucket=bucketname, Key=objectname,
                                CopySource=bucketname + '/' + objectname,
                                Metadata=m, MetadataDirective='REPLACE')

    def readmetadata(bucketname, objectname):
        ALLDATAOFOBJECT = s3ressource.get_object(Bucket=bucketname, Key=objectname)
        print(ALLDATAOFOBJECT['Metadata'])

    # Create the list of objects on a per-bucket basis
    BuildObjectListPerBucket(param_4)

    # Call the functions to see the results
    for objectitem in listofObjectstobeanalyzed:
        readmetadata(param_4, objectitem)
        ADDmetadata(param_4, objectitem)
        readmetadata(param_4, objectitem)
        CHANGEmetadata(param_4, objectitem)
        readmetadata(param_4, objectitem)
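As a side note, the extension filter in BuildObjectListPerBucket relies on str.endswith accepting a tuple of suffixes. A small stand-alone sketch of that check (the key names are illustrative):

```python
extensions = ('.jpg', '.png')

def is_image(key):
    # str.endswith accepts a tuple, so one call tests all suffixes at once
    return key.endswith(extensions)

keys = ["a.jpg", "b.txt", "c.png", "d.PNG"]
print([k for k in keys if is_image(k)])  # ['a.jpg', 'c.png'] -- matching is case-sensitive
```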

answered Apr 13 '17 at 8:14 by MouIdri (edited Apr 24 '17 at 13:58)

You can do this with copy_from() on the resource, as another answer mentions, but you can also use the client's copy_object() and specify the same source and destination. The methods are equivalent and invoke the same code underneath.



    import boto3

    s3 = boto3.client("s3")
    src_key = "my-key"
    src_bucket = "my-bucket"
    s3.copy_object(Key=src_key, Bucket=src_bucket,
                   CopySource={"Bucket": src_bucket, "Key": src_key},
                   Metadata={"my_new_key": "my_new_val"},
                   MetadataDirective="REPLACE")


The 'REPLACE' directive means the metadata on the request overwrites the source object's metadata entirely. If you only mean to add new key-value pairs, you have to read the original metadata first and merge your additions into it.



You can retrieve the original metadata with head_object(Key=src_key, Bucket=src_bucket). If you take that route, also pass CopySourceIfMatch=original_etag in the copy_object(...) request to preserve the atomicity of the single-call version: if anything changes the object between the read and the write, copy_object fails with an HTTP 412 (Precondition Failed) error.
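The read-then-conditional-copy pattern described above can be sketched without touching AWS at all. Here head_fn and copy_fn are hypothetical stand-ins for head_object and the conditional copy_object call, and PreconditionFailed stands in for the botocore ClientError you would catch on a real HTTP 412:

```python
class PreconditionFailed(Exception):
    """Stand-in for the botocore ClientError raised on HTTP 412."""

def add_metadata(head_fn, copy_fn, updates, retries=3):
    """Read-modify-write loop: merge `updates` into the current metadata and
    retry if the object changed between the read and the conditional copy."""
    for _ in range(retries):
        obj = head_fn()                       # returns {'ETag': ..., 'Metadata': {...}}
        metadata = dict(obj["Metadata"])
        metadata.update(updates)
        try:
            copy_fn(metadata, if_match=obj["ETag"])   # CopySourceIfMatch analogue
            return metadata
        except PreconditionFailed:
            continue                          # concurrent change: re-read and retry
    raise RuntimeError("object kept changing between read and copy")
```

On a retry the loop re-reads the metadata, so a concurrent writer's keys are picked up rather than silently clobbered.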



              Reference: boto3 issue 389

                                0














                                You can do this using copy_from() on the resource (like this answer) mentions, but you can also use the client's copy_object() and specify the same source and destination. The methods are equivalent and invoke the same code underneath.



                                import boto3

                                s3 = boto3.client("s3")
                                src_key = "my-key"
                                src_bucket = "my-bucket"

                                s3.copy_object(Key=src_key, Bucket=src_bucket,
                                               CopySource={"Bucket": src_bucket, "Key": src_key},
                                               Metadata={"my_new_key": "my_new_val"},
                                               MetadataDirective="REPLACE")


                                The 'REPLACE' directive means that the metadata on the request will overwrite the source metadata entirely. If you only mean to add new key-value pairs, you have to read the original metadata first and merge.



                                You can retrieve the original metadata with head_object(Key=src_key, Bucket=src_bucket). If that's the route you take, you should also pass CopySourceIfMatch=original_etag in the copy_object(...) request to preserve the same atomicity property as the single-call version. If anything were to change the object between the read and the write, the copy_object would fail with an HTTP 412 (Precondition Failed) error.
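                                Put together, that read-modify-write with the ETag guard might look like this (a sketch; the function name and the `new_items` parameter are illustrative, and the optional `s3` argument exists only so the logic can be exercised without AWS):

```python
def update_metadata_guarded(bucket, key, new_items, s3=None):
    """Add key-values to an object's user metadata; the copy fails
    with HTTP 412 if the object changed between the read and the write."""
    if s3 is None:
        import boto3  # lazy import so the logic can be tested with a stub
        s3 = boto3.client('s3')
    head = s3.head_object(Bucket=bucket, Key=key)
    metadata = dict(head['Metadata'])
    metadata.update(new_items)
    # CopySourceIfMatch pins the copy to the exact version we just read.
    s3.copy_object(Bucket=bucket, Key=key,
                   CopySource={'Bucket': bucket, 'Key': key},
                   CopySourceIfMatch=head['ETag'],
                   Metadata=metadata,
                   MetadataDirective='REPLACE')
    return metadata
```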



                                Reference: boto3 issue 389






                                    edited Nov 21 '18 at 3:32

























                                    answered Nov 21 '18 at 3:27









                                    init_js

