Let's say I have a large data array updated 1000+ times per second.
Another application wants to access and read the array in a short interval. Both applications are on the same machine.
I have tried using WCF for interprocess communication, but serializing and sending the whole array (or a large object) thousands of times per second is infeasible performance-wise.
Is there a way to directly access objects from different applications in C#?
There are a few IPC technologies that, though they pre-date WCF, are still relevant today.
Pipes
Pipes are one such technology. They're binary, run in kernel mode and are very fast, though they're quite low-level and don't give access to "objects".
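For instance, a minimal named-pipe sketch might look something like this (the pipe name and the raw binary wire format are placeholders of my own, not part of the original answer):

```csharp
using System;
using System.IO;
using System.IO.Pipes;

class PipeServer
{
    static void Main()
    {
        double[] data = new double[100000]; // the large array, updated elsewhere

        using (var server = new NamedPipeServerStream("MyDataPipe", PipeDirection.Out))
        {
            server.WaitForConnection();

            // Write a snapshot of the array as raw binary - no XML/SOAP involved.
            using (var writer = new BinaryWriter(server))
            {
                writer.Write(data.Length);
                foreach (double value in data)
                    writer.Write(value);
            }
        }
    }
}

// Client process:
//
//   using (var client = new NamedPipeClientStream(".", "MyDataPipe", PipeDirection.In))
//   {
//       client.Connect();
//       using (var reader = new BinaryReader(client))
//       {
//           int count = reader.ReadInt32();
//           double[] data = new double[count];
//           for (int i = 0; i < count; i++)
//               data[i] = reader.ReadDouble();
//       }
//   }
```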
.NET Remoting
.NET Remoting will give access to objects but is perhaps not as fast as pipes.
Both pipes and .NET Remoting are faster than serialization technologies like WCF, which converts things to verbose XML/SOAP.
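A rough sketch of a Remoting setup over its IPC channel (the channel name, object URI and service type here are placeholders of my own):

```csharp
using System;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Ipc;

// The object lives in the server process; clients talk to it
// through a transparent proxy rather than a serialized copy.
public class DataService : MarshalByRefObject
{
    private readonly double[] _data = new double[100000];

    public double GetItem(int index)
    {
        return _data[index];
    }
}

class Server
{
    static void Main()
    {
        ChannelServices.RegisterChannel(new IpcChannel("MyDataChannel"), false);
        RemotingConfiguration.RegisterWellKnownServiceType(
            typeof(DataService), "DataService.rem", WellKnownObjectMode.Singleton);

        Console.ReadLine(); // keep the server process alive
    }
}

// Client process:
//
//   var service = (DataService)Activator.GetObject(
//       typeof(DataService), "ipc://MyDataChannel/DataService.rem");
//   double value = service.GetItem(42);
```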
COM
COM is a binary protocol for IPC. COM is a client-server model where the client requests data from the COM or OLE server. The beauty of COM is that you have direct access to objects in the server; they are not serialised. For example, requesting an element in a SAFEARRAY.
A SAFEARRAY is an Automation-safe structure of arbitrary dimensions consisting of type-safe data. Luckily .NET will hide the SAFEARRAY gobble-de-gook for us.

In my example I have created a Manager class that will expose the array. To get to the Manager I've used a factory pattern so that Manager is essentially a singleton.

You should lay out your project as follows:

MyComLib.Contracts.dll - the contracts (interfaces) only; this is what both client and server reference
MyComLib.dll - the implementation: Factory and Manager
First the contracts:
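Something along these lines should do (a sketch only - the GUIDs, interface names and member signatures are illustrative placeholders, not the original code):

```csharp
using System;
using System.Runtime.InteropServices;

namespace MyComLib.Contracts
{
    [ComVisible(true)]
    [Guid("6A9B2E11-0001-4C2C-9D7A-3F1B2C4D5E01")] // placeholder GUID
    public interface IManager
    {
        // Expose the array element-by-element so clients never pull
        // the whole thing across in one call.
        int Count { get; }
        double GetItem(int index);
    }

    [ComVisible(true)]
    [Guid("6A9B2E11-0002-4C2C-9D7A-3F1B2C4D5E02")] // placeholder GUID
    public interface IFactory
    {
        IManager GetManager();
    }
}
```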
Now for the factory pattern:
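Again as a sketch, with the singleton held in a static field (names and GUID are placeholders):

```csharp
using System;
using System.Runtime.InteropServices;
using MyComLib.Contracts;

namespace MyComLib
{
    [ComVisible(true)]
    [Guid("6A9B2E11-0003-4C2C-9D7A-3F1B2C4D5E03")] // placeholder GUID
    [ClassInterface(ClassInterfaceType.None)]
    [ProgId("MyComLib.Factory")]
    public class Factory : IFactory
    {
        // One Manager per server process - the singleton.
        private static readonly Manager TheManager = new Manager();

        public IManager GetManager()
        {
            return TheManager;
        }
    }
}
```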
The manager:
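Which, in this sketch, simply wraps the big array (again, names and GUID are placeholders):

```csharp
using System;
using System.Runtime.InteropServices;
using MyComLib.Contracts;

namespace MyComLib
{
    [ComVisible(true)]
    [Guid("6A9B2E11-0004-4C2C-9D7A-3F1B2C4D5E04")] // placeholder GUID
    [ClassInterface(ClassInterfaceType.None)]
    public class Manager : IManager
    {
        // The large array that gets updated 1000+ times per second.
        private readonly double[] _data = new double[100000];

        public int Count
        {
            get { return _data.Length; }
        }

        public double GetItem(int index)
        {
            return _data[index];
        }
    }
}
```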
A test app. This should reference only MyComLib.Contracts.dll, not MyComLib.dll.
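Assuming MyComLib.dll has been registered for COM (e.g. with regasm), the client can activate the factory late-bound via its ProgId, something like:

```csharp
using System;
using MyComLib.Contracts;

class Program
{
    static void Main()
    {
        // Only the contracts assembly is referenced; the implementation
        // is reached through COM activation.
        Type factoryType = Type.GetTypeFromProgID("MyComLib.Factory");
        var factory = (IFactory)Activator.CreateInstance(factoryType);

        IManager manager = factory.GetManager();
        Console.WriteLine("Count      = {0}", manager.Count);
        Console.WriteLine("First item = {0}", manager.GetItem(0));
    }
}
```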
One final step is to change this in-process COM server to an out-of-process COM server, so that multiple processes share the same Manager rather than each creating its own singleton. In other words, a singleton that spans processes. When the Manager is running, it is essentially in its own process space, separate from all the other client processes. For that you'll need to configure a COM surrogate, which is explained in detail here.
File Mapping/Shared Memory
Lastly, File Mapping allows you to manipulate a file as if it were nothing more than a large block of memory in the process's address space. No fiddly file seek, read or write operations: just grab a pointer to the memory block and start reading and writing. The system will do the rest.
(The File Mapping topic on MSDN covers this in detail.)
Sadly, it still requires you to write your data out in the first place, and for it to be most effective you would need to change your application to treat the memory block as the source of truth rather than your in-memory array. Otherwise you'll be serializing all the time.
However, shared memory via the swap file does technically allow you to eliminate the serialize/deserialize step between your client and server apps, as well as the duplication of data on the heap. Though as I said, you may need to adjust your app to work with raw memory buffers rather than objects.
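As a rough sketch (the map name and element count are placeholders, and a real implementation would need a mutex or similar to coordinate the writer and readers):

```csharp
using System;
using System.IO.MemoryMappedFiles;

class SharedMemoryWriter
{
    const string MapName = "MyDataMap";
    const int Count = 100000;

    static void Main()
    {
        // The shared block itself becomes the source of truth for the array.
        using (var mmf = MemoryMappedFile.CreateOrOpen(MapName, Count * sizeof(double)))
        using (var accessor = mmf.CreateViewAccessor())
        {
            var data = new double[Count];
            // ... update data 1000+ times per second, then publish a snapshot ...
            accessor.WriteArray(0, data, 0, data.Length);

            Console.ReadLine(); // keep the mapping alive while readers use it
        }
    }
}

// Reader process (while the writer is still running):
//
//   using (var mmf = MemoryMappedFile.OpenExisting("MyDataMap"))
//   using (var accessor = mmf.CreateViewAccessor())
//   {
//       var copy = new double[100000];
//       accessor.ReadArray(0, copy, 0, copy.Length);
//   }
```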
Tell me more
NOTE: Contrary to popular belief, .NET Remoting is not entirely obsolete. One contemporary use for it is communication between objects in different AppDomains within the same process, something you typically do in plug-in systems.
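For instance, a minimal sketch of that cross-AppDomain use (the type and domain names are placeholders):

```csharp
using System;

// Deriving from MarshalByRefObject means calls from the host AppDomain
// go through a Remoting proxy instead of copying the object across.
public class Plugin : MarshalByRefObject
{
    public string Describe()
    {
        return "Running in AppDomain: " + AppDomain.CurrentDomain.FriendlyName;
    }
}

class Host
{
    static void Main()
    {
        AppDomain sandbox = AppDomain.CreateDomain("PluginDomain");

        var plugin = (Plugin)sandbox.CreateInstanceAndUnwrap(
            typeof(Plugin).Assembly.FullName,
            typeof(Plugin).FullName);

        Console.WriteLine(plugin.Describe()); // executes inside PluginDomain
        AppDomain.Unload(sandbox);
    }
}
```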