Too many files open
import cv2
from PIL import Image

Image.fromarray(cv2.imread(link, cv2.IMREAD_GRAYSCALE))


I'm currently trying to complete a project, but I keep getting a "too many files open" error on my Linux GPU server, which crashes the server.



I'm loading 3 images for CNN classification using the code shown above. Has anyone facing the same problem found a solution?



Thank you.
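For readers hitting the same error: "Too many open files" generally means file descriptors are being leaked faster than they are closed. This is a minimal stdlib-only sketch, not part of the original post (the temp files stand in for image files), contrasting a pattern that leaks handles with one that closes them promptly:

```python
import os
import tempfile

# Create a few throwaway files to read back, standing in for images.
paths = []
for i in range(3):
    fd, path = tempfile.mkstemp()
    os.write(fd, b"pixel data")
    os.close(fd)
    paths.append(path)

# Leaky pattern: open() without close() keeps each descriptor alive
# until the garbage collector happens to reclaim the file object.
leaked = [open(p, "rb") for p in paths]
data_leaky = [f.read() for f in leaked]
for f in leaked:
    f.close()  # explicit cleanup; in the leaky pattern this is missing

# Safe pattern: "with" closes the descriptor as soon as the block exits,
# so the number of simultaneously open files stays bounded.
data_safe = []
for p in paths:
    with open(p, "rb") as f:
        data_safe.append(f.read())

assert data_leaky == data_safe  # same bytes, but bounded descriptor use
for p in paths:
    os.remove(p)
```

If a `Dataset.__getitem__` opens files like this on every call across many DataLoader workers, unclosed handles accumulate quickly.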
  • You should add more information: what is the size of your dataset? The batch size? The relevant error message should also be posted, and please polish your writing to make it clear. See also: how to ask a good question.

    – jdhao
    Jan 2 at 16:25
Tags: python-imaging-library, pytorch
asked Jan 2 at 1:38
Qingyao Hu
1 Answer
Try switching to the file_system sharing strategy by adding this to your script:



import torch.multiprocessing
torch.multiprocessing.set_sharing_strategy('file_system')
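Independently of the sharing strategy, it is worth confirming what the per-process descriptor limit actually is. A generic Linux check, not part of the original answer:

```shell
# Soft limit: the cap enforced on this process right now.
soft=$(ulimit -Sn)
# Hard limit: the ceiling the soft limit may be raised to without root.
hard=$(ulimit -Hn)
echo "soft=$soft hard=$hard"

# Raise the soft limit to the hard limit for this session (no root needed).
if [ "$hard" != "unlimited" ]; then
    ulimit -S -n "$hard"
fi
ulimit -Sn
```

Run this in the shell that launches training; limits set in one shell do not affect processes started elsewhere.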
  • Hi Silas, the server administrator has already increased the maximum file limit, but it seems the problem might be with how my images are loaded by the PyTorch DataLoader. I'm trying to see if anyone else is facing the same problem.

    – Qingyao Hu
    Jan 2 at 2:06
  • The default file_descriptor sharing strategy uses file descriptors as shared-memory handles, and it hits the limit when the DataLoader produces too many batches; in this case the handles come from the images you're loading.

    – Silas Jojo
    Jan 2 at 3:07
  • pytorch.org/docs/master/…

    – Silas Jojo
    Jan 2 at 3:13
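The same limit can also be inspected and raised from Python itself via the stdlib `resource` module (Unix only). This is a generic sketch, not taken from the linked docs:

```python
import resource

# RLIMIT_NOFILE is the per-process cap on open file descriptors --
# the limit that "Too many open files" (errno EMFILE) refers to.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")

# A process may raise its own soft limit up to the hard limit
# without root privileges.
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
new_soft, _ = resource.getrlimit(resource.RLIMIT_NOFILE)
assert new_soft == hard
```

Raising the limit only buys headroom; if the DataLoader keeps accumulating descriptors, switching the sharing strategy as above addresses the cause.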
edited Jan 2 at 3:11
answered Jan 2 at 1:54
Silas Jojo