Does an application memory leak cause an operating system memory leak?

Published 2020-05-30 02:17

Question:

When we say a program leaks memory (say, a new without a delete in C++), does it really leak? I mean, when the program ends, is that memory still allocated to some non-running program and unusable by anything else, or does the OS track which memory each program requested and release it when the program ends? If I run that program many times, will I run out of memory?

Answer 1:

On operating systems with protected memory (Mac OS X and later, all Unix clones such as Linux, and NT-based Windows systems, meaning Windows 2000 and later), the memory is released when the program ends.

If you run any program often enough without closing instances in between (running more and more instances at the same time), you will eventually run out of memory, whether or not there is a memory leak. A program that leaks memory will obviously fill memory faster than an identical program that doesn't, but how many instances you can run before filling memory depends far more on how much memory the program needs for normal operation than on whether it leaks. The comparison is only meaningful between two otherwise identical programs, one with a memory leak and one without.

Memory leaks are most serious in programs that run for a very long time. A classic example is server software, such as web servers. With games, spreadsheets, or word processors, memory leaks aren't nearly as serious, because you eventually close those programs, freeing the memory. But of course memory leaks are nasty little beasts that should always be tackled as a matter of principle.

But as stated earlier, all modern operating systems release a program's memory when it closes, so even with a memory leak, you won't fill up memory by repeatedly opening and closing the program.



Answer 2:

No. In all practical operating systems, when a program exits, all its resources are reclaimed by the OS. Memory leaks become a more serious issue in programs that run for an extended time, and in functions that are called often over the life of the program.



Answer 3:

Leaked memory is reclaimed by the OS after execution has stopped.

That's why it isn't always a big problem for desktop applications, but it is a big problem for servers and services (they tend to run for a long time).

Let's look at the following scenario:

  1. Program A asks the OS for memory.
  2. The OS marks block X as used by A and returns it to the program.
  3. The program now holds a pointer to X.
  4. The program returns the memory.
  5. The OS marks the block as free. Using the block now results in an access violation.
  6. Program A ends, and all memory used by A is marked unused.

Nothing wrong with that.

But if the memory is allocated in a loop and the delete is forgotten, you run into real problems:

  1. Program A asks the OS for memory.
  2. The OS marks block X as used by A and returns it to the program.
  3. The program now holds a pointer to X.
  4. Go to step 1.

If the OS runs out of memory, the program will probably crash.



Answer 4:

No. Once the OS finishes closing the program, the memory comes back (on any reasonably modern OS). The problem is with long-running processes.



Answer 5:

When the process ends, its memory is released as well. The problem is that a program that leaks memory requests more and more memory from the OS as it runs, and can eventually exhaust the system's memory.



Answer 6:

It's "leaking" more in the sense that the code itself no longer has any grip on that piece of memory.



Answer 7:

The OS releases the memory when the program ends, so a leak is only an issue while the program is running. This is a problem for long-running programs such as server processes. Likewise, if your web browser had a memory leak and you kept it running for days, it would gradually consume more and more memory.



Answer 8:

As far as I know, on most operating systems a program, when started, receives a defined segment of memory that is completely released once the program ends.

Memory leaks are one of the main reasons garbage-collection algorithms were invented: once plugged into the runtime, the collector becomes responsible for reclaiming memory that is no longer reachable by the program.



Answer 9:

Memory leaks don't persist past the end of execution, so a "solution" to any memory leak is simply to end the program. Obviously this matters more for some kinds of software than others: a database server that needs to go offline every 8 hours because of memory leaks is a bigger problem than a video game that needs to be restarted after 8 hours of continuous play.

The term "leak" refers to the fact that, over time, memory consumption grows without any increased benefit. The "leaked" memory is neither used by the program nor usable by the OS and other programs.

Sadly, memory leaks are very common in unmanaged code. I have had Firefox running for a couple of days now, and memory usage is 424 MB despite only four tabs being open. If I closed Firefox and re-opened the same tabs, memory usage would likely be under 100 MB. Thus 300+ MB has "leaked".