Too many files open
import cv2
from PIL import Image

img = Image.fromarray(cv2.imread(link, cv2.IMREAD_GRAYSCALE))
I'm trying to complete a project, but I keep getting a "too many open files" error on my Linux GPU server, which ends up crashing the server. I'm loading 3 images for CNN classification using the code shown above. Has anyone run into the same problem and found a solution?
Thank you.
python-imaging-library pytorch
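One way to confirm that file descriptors are actually leaking is to count the process's open descriptors while the loader runs. A minimal sketch (psutil is an assumed third-party dependency, and num_fds() is Unix-only):

import psutil  # assumed installed: pip install psutil

proc = psutil.Process()  # current process
print("open file descriptors:", proc.num_fds())
# equivalent on Linux without psutil: len(os.listdir('/proc/self/fd'))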
asked Jan 2 at 1:38 by Qingyao Hu
You should add more information. What is the size of your dataset? What batch size are you using? The relevant error message should also be posted, and your writing could be polished for clarity. See also how to ask a good question.
– jdhao
Jan 2 at 16:25
1 Answer
Try switching to the file_system sharing strategy by adding this to your script:

import torch.multiprocessing
torch.multiprocessing.set_sharing_strategy('file_system')
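For context, here is a minimal sketch of where that call belongs: it must run before any DataLoader workers are spawned. The dataset below is a placeholder, not the asker's code.

import torch
import torch.multiprocessing
from torch.utils.data import DataLoader, TensorDataset

# Set the strategy once, before any worker processes are created
torch.multiprocessing.set_sharing_strategy('file_system')

dataset = TensorDataset(torch.zeros(8, 1, 28, 28))  # placeholder data
loader = DataLoader(dataset, batch_size=2, num_workers=2)

for (batch,) in loader:
    pass  # training step would go here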
answered Jan 2 at 1:54, edited Jan 2 at 3:11, by Silas Jojo
Hi Silas, the server administrator has already increased the maximum open-file limit, but it seems the problem may be in how my images are loaded by the PyTorch DataLoader. I'm trying to see if anyone else is facing the same problem.
– Qingyao Hu
Jan 2 at 2:06
The default file_descriptor sharing strategy uses file descriptors as shared-memory handles, and this hits the limit when there are too many batches queued in the DataLoader; in this case, that means the images you're using.
– Silas Jojo
Jan 2 at 3:07
pytorch.org/docs/master/…
– Silas Jojo
Jan 2 at 3:13
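To inspect the limit discussed above from inside Python, the standard-library resource module (Unix-only) can read the per-process descriptor limits and raise the soft limit up to the hard one. A minimal sketch:

import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft limit:", soft, "hard limit:", hard)

# The soft limit can be raised up to the hard limit without root privileges
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))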