AWS Redshift: How to store text field with size greater than 100K

I have a text field in a Parquet file with a maximum length of 141598. I am loading the Parquet file into Redshift, and the load fails because the most a VARCHAR can store is 65535.
Is there any other data type I can use, or another approach I could take?

Error while loading:

S3 Query Exception (Fetch). Task failed due to an internal error. The length of the data column friends is longer than the length defined in the table. Table: 65535, Data: 141598

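For context, a Parquet load into Redshift is a plain COPY from S3; a minimal sketch of the kind of command involved (table, bucket, and IAM role names are placeholders, not taken from the question):

    COPY my_table
    FROM 's3://my-bucket/path/to/parquet/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
    FORMAT AS PARQUET;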

amazon-web-services amazon-redshift

asked Jan 3 at 3:33
Sanchit Kumar

  • no, that is the max.

    – Jon Scott
    Jan 3 at 10:26

1 Answer

No, the maximum length of the VARCHAR data type is 65535 bytes, and that is the largest character type Redshift can store. Note that the length is measured in bytes, not characters, so the actual number of characters that fit depends on their byte length in UTF-8.

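As a quick illustration of the bytes-versus-characters point (the table and values here are made up for the example):

    CREATE TABLE t (s VARCHAR(6));
    INSERT INTO t VALUES ('abcdef');  -- 6 ASCII characters = 6 bytes: fits
    INSERT INTO t VALUES ('ääää');    -- 4 characters but 8 bytes in UTF-8: rejected as too long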


If the data is already in Parquet format, then you may not need to load it into a Redshift table at all; instead you could create a Spectrum external table over it. The external table definition will still only support a VARCHAR of up to 65535, the same as a normal table, and any query against the column will silently truncate characters beyond that length. However, the original data is preserved in the Parquet file and is potentially accessible by other means if needed.

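A minimal sketch of what that could look like, assuming an IAM role with the necessary Glue and S3 permissions; the schema name, database name, role ARN, S3 path, table name, and the id column are placeholders, while friends is the column named in the error:

    CREATE EXTERNAL SCHEMA spectrum_schema
    FROM DATA CATALOG
    DATABASE 'spectrum_db'
    IAM_ROLE 'arn:aws:iam::123456789012:role/MySpectrumRole'
    CREATE EXTERNAL DATABASE IF NOT EXISTS;

    CREATE EXTERNAL TABLE spectrum_schema.friends_data (
        id BIGINT,
        friends VARCHAR(65535)  -- still capped at 65535; longer values are truncated on read
    )
    STORED AS PARQUET
    LOCATION 's3://my-bucket/path/to/parquet/';

Because Spectrum reads the Parquet files in place, nothing is rewritten at load time; only queries against the friends column are subject to the truncation.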

answered Jan 3 at 19:18
Nathan Griffiths

  • Spectrum seems a good idea. Gonna give it a try.

    – Sanchit Kumar
    Jan 4 at 0:45