Beam Dataflow only writes to temp in GCS

I have a very basic Python Dataflow job that reads some data from Pub/Sub, applies a FixedWindow and writes to Google Cloud Storage.



transformed = ...
transformed | beam.io.WriteToText(known_args.output)


The output is written to the location specified in --output, but only to the temporary stage, i.e.



gs://MY_BUCKET/MY_DIR/beam-temp-2a5c0e1eec1c11e8b98342010a800004/...some_UUID...


The file never gets placed into the correctly named location with the sharding template.



Tested on both the local and Dataflow runners.
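
For context, a minimal sketch of the kind of pipeline described above (the Pub/Sub topic, window size, decode step and output path are illustrative assumptions, not the actual job):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

# Hypothetical values for illustration only.
TOPIC = "projects/MY_PROJECT/topics/MY_TOPIC"
OUTPUT = "gs://MY_BUCKET/MY_DIR/output"

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    transformed = (
        p
        | "Read" >> beam.io.ReadFromPubSub(topic=TOPIC)
        | "Decode" >> beam.Map(lambda msg: msg.decode("utf-8"))
        | "Window" >> beam.WindowInto(FixedWindows(60))  # 60-second fixed windows
    )
    transformed | "Write" >> beam.io.WriteToText(OUTPUT)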





When testing further, I noticed that the streaming_wordcount example has the same issue, whereas the standard wordcount example writes fine. Perhaps the issue is to do with windowing, or with reading from Pub/Sub?





It appears WriteToText is not compatible with the streaming Pub/Sub source. There are likely workarounds, and the Java version may be compatible, but I have opted for a different solution altogether.










google-cloud-storage google-cloud-dataflow apache-beam google-cloud-pubsub






edited Nov 20 '18 at 10:41
asked Nov 19 '18 at 17:10
Daniel Messias












  • can you please post the code?
    – Tanveer Uddin
    Nov 19 '18 at 21:41


















1 Answer
Python streaming pipeline execution is experimentally available (with some limitations).



Unsupported features apply to all runners:

• State and Timers APIs
• Custom source API
• Splittable DoFn API
• Handling of late data
• User-defined custom WindowFn



Additionally, DataflowRunner does not currently support the following Cloud Dataflow-specific features with Python streaming execution:



• Streaming autoscaling
• Updating existing pipelines
• Cloud Dataflow Templates
• Some monitoring features, such as msec counters, display data, metrics, and element counts for transforms. However, logging, watermarks, and element counts for sources are supported.



https://beam.apache.org/documentation/sdks/python-streaming/



Since you're using FixedWindowFn and the pipeline was able to write the data into the temp location, please recheck the output location: --output gs://<your-gcs-bucket>/<your-gcs-folder>/<your-gcs-output-filename>
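
If the flag itself is suspect, here is a minimal sketch of how --output is commonly parsed and handed to WriteToText (the parser below mirrors the question's known_args.output but is an assumption, not the asker's actual code):

import argparse

from apache_beam.options.pipeline_options import PipelineOptions

parser = argparse.ArgumentParser()
parser.add_argument(
    "--output",
    required=True,
    help="e.g. gs://<your-gcs-bucket>/<your-gcs-folder>/<your-gcs-output-filename>",
)
# known_args.output becomes the WriteToText prefix; the remaining flags
# (--runner, --project, --temp_location, ...) go to PipelineOptions.
known_args, pipeline_args = parser.parse_known_args()
options = PipelineOptions(pipeline_args, streaming=True)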






        answered Dec 6 '18 at 21:42









        Satish Athukuri
