alastairtree/lazycache

Lazy Cache

Lazy cache is a simple in-memory caching service. It has a developer friendly generics based API, and provides a thread safe cache implementation that guarantees to only execute your cachable delegates once (it's lazy!). Under the hood it leverages Microsoft.Extensions.Caching and Lazy to provide performance and reliability in heavy load scenarios.

Download

LazyCache is available on NuGet. To install LazyCache, run the following command in the Package Manager Console:
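
The command itself did not survive extraction; assuming the standard LazyCache package id on nuget.org, it would be:

```
Install-Package LazyCache
```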

Quick start

See the quick start wiki

Sample code
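
The code sample itself did not survive extraction. A minimal sketch of the pattern described below, with a stand-in for the expensive call, might look like this:

```csharp
using System;
using LazyCache;

IAppCache cache = new CachingService();

// Declare (but don't execute) a factory whose result we want to cache;
// DateTime.UtcNow stands in for an expensive call (a DB query, web service call, etc.)
Func<DateTime> expensiveFactory = () =>
{
    Console.WriteLine("Factory executed (only on a cache miss)");
    return DateTime.UtcNow;
};

// Get the item from the cache, or build it exactly once via the factory and cache it for next time
DateTime cachedResult = cache.GetOrAdd("uniqueKey", expensiveFactory);
```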

As you can see, the magic happens in the GetOrAdd() method, which gives the consumer an atomic and tidy way to add caching to your code. It leverages a factory delegate (Func) and generics to make it easy to add cached method calls to your app.

It means you avoid the usual "check the cache - execute the factory function - add the results to the cache" pattern, saves you writing the double-checked locking cache pattern, and means you can be a lazy developer!

What should I use it for?

LazyCache suits the caching of database calls, complex object graph building routines and web service calls that should be cached for performance. Items can be cached for long or short periods, but the default is 20 minutes.
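
For example, to cache for a custom period rather than the 20 minute default, you can pass an expiry to GetOrAdd (a sketch; the factory shown is a stand-in for a real expensive call):

```csharp
using System;
using LazyCache;

IAppCache cache = new CachingService();

// Cache the result for 5 minutes instead of the default 20
var value = cache.GetOrAdd(
    "some-key",
    () => DateTime.UtcNow,                 // stand-in for an expensive call
    DateTimeOffset.UtcNow.AddMinutes(5));  // custom absolute expiry
```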

.Net framework and dotnet core support?

The latest version targets netstandard 2.0. See .net standard implementation support

For dotnet core 2, .net framework net461 or above, or netstandard 2+, use LazyCache 2 or above.

For .net framework without netstandard 2 support, such as net45, net451 and net46, use LazyCache 0.7 - 1.x

For .net framework 4.0 use LazyCache 0.6

Features

  • Simple API with familiar sliding or absolute expiration
  • Guaranteed single evaluation of your factory delegate whose results you want to cache
  • Strongly typed generics based API. No need to cast your cached objects every time you retrieve them
  • Stops you inadvertently caching an exception by removing Lazys that evaluate to an exception
  • Thread safe, concurrency ready
  • Async compatible - lazy single evaluation of async delegates using GetOrAddAsync()
  • Interface based API and built in MockCache to support test driven development and dependency injection
  • Leverages a provider model on top of IMemoryCache under the hood and can be extended with your own implementation
  • Good test coverage

Documentation

  • The wiki
  • Adding caching to a .net application to make it faster

Sample Application

See CacheDatabaseQueriesApiSample for an example of how to use LazyCache to cache the results of an Entity Framework query in a web API controller. Watch how the cache saves trips to the database and how results are returned to the client far more quickly from the in-memory cache.

Contributing

If you have an idea or want to fix an issue, please open an issue on GitHub to discuss it and it will be considered.

If you have code to share, you should submit a pull request: fork the repo, then create a branch on that fork with your changes, and when you are happy create a pull request from your branch into LazyCache master for review. See https://help.github.com/en/articles/creating-a-pull-request-from-a-fork.

LazyCache is narrow in focus and well established, so it is unlikely to accept massive changes out of nowhere, but come and talk about it on GitHub and we can all collaborate on something that works for everyone. It is also quite extensible, so you may be able to extend it in your project or add a companion library if necessary.

Issues

Quick list of the latest Issues we found

Coder3333

bug

I expect CachingService.TryGetValue to return the value that I stored in the cache, but instead, it returns the Lazy that was used to generate the value. I would expect a call to GetValueFromLazy in this method to make sure the right object is returned. You will see this behavior if you use GetOrCreate to initially store the value, and then follow up with TryGetValue to read the value.

Also, because this re-uses T when calling CacheProvider.TryGetValue<T>, it never finds the value, since the item is actually stored as a Lazy wrapper. You would need to call CacheProvider.TryGetValue with the stored wrapper type to be able to fetch the value.

Here is the TryGetValue method from https://github.com/alastairtree/LazyCache/blob/master/LazyCache/CachingService.cs.

public virtual bool TryGetValue<T>(string key, out T value)
{
    ValidateKey(key);

    return CacheProvider.TryGetValue(key, out value);
}

phadtrapong

question

Hi guys, does anyone know whether this library currently allows multiple threads to keep reading data from the cache while other threads update an existing key with a new value? Or will multiple threads be blocked when they try to read before the new value becomes available?

laukoksoon

bug

Describe the bug
My goal here is to cache some long-running process data and then, upon expiration, take the data that is about to expire and append the new delta changes from the database. The long-running process shouldn't run more than once.

To Reproduce
I have attached the console program that I wrote to reproduce the issue.

When I set int parallelNumber = 100; (100 or below), everything runs as EXPECTED.

When I set int parallelNumber = 300; (300 or above), the long process is called more than once [NOT OK].

Expected behavior
The long process should be called ONCE even when the parallel count is set to 300 or above.

Framework and Platform

  • OS: Windows 10
  • Framework v4.6.2
  • LazyCache Version 2.4

Console source code

using LazyCache;
using Microsoft.Extensions.Caching.Memory;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

namespace ConsoleApp1
{
    class program
    {
        private static IAppCache _lazyCache;
        private static int _totalLongProcess;
        private static int _totalAppendProcess;

        static List<int> LongProcess(string key)
        {
            _totalLongProcess++;
            if (_totalLongProcess > 1)
            {
                throw new Exception("Not suppose to run more than one time");
            }

            Console.WriteLine("LONG PROCESS for key " + key);
            Thread.Sleep(5 * 1000);
            return Enumerable.Range(0, 100).ToList();
        }

        static List<int> AppendElement(string key, List<int> ori)
        {
            _totalAppendProcess++;
            Console.WriteLine("Append Process for key " + key);
            Thread.Sleep(2 * 1000);

            var newList = new List<int>();
            newList.Add(12345);
            newList.Add(54321);

            ori.AddRange(newList);
            return ori;
        }

        static List<int> GetCacheByKey(string key, string threadId)
        {
            Console.WriteLine($"Request key {key}, threadId {threadId}");
            var result = _lazyCache.GetOrAdd(key, () => LongProcess(key), GetOptions());
            return result;
        }

        static MemoryCacheEntryOptions GetOptions()
        {
            // ensure the cache item expires exactly on its absolute expiration (and not lazily on the next access)
            var options = new LazyCacheEntryOptions()
                .SetAbsoluteExpiration(TimeSpan.FromSeconds(10), ExpirationMode.ImmediateExpiration);

            // as soon as it expires, re-add it to the cache
            options.RegisterPostEvictionCallback((keyEvicted, value, reason, state) =>
            {
                // don't re-add if running out of memory or it was forcibly removed
                if (reason == EvictionReason.Expired || reason == EvictionReason.TokenExpired)
                {
                    var oriList = value as List<int>;
                    _lazyCache.GetOrAdd(keyEvicted.ToString(), _ => AppendElement(keyEvicted.ToString(), oriList), GetOptions()); // calls itself to get another set of options!
                }
            });
            return options;
        }

        static void Main(string[] args)
        {
            _lazyCache = new CachingService();

            int round = 1;
            int i = 0;
            while (i < round)
            {
                int parallelNumber = 300;
                Parallel.For(0, parallelNumber, count =>
                {
                    Thread.Sleep(1 * 1000);
                    var list = GetCacheByKey("1", Thread.CurrentThread.ManagedThreadId.ToString());
                    Console.WriteLine($"Got result for key {"1"}, threadId {Thread.CurrentThread.ManagedThreadId.ToString()}, count {list.Count}");
                });

                i++;
            }

            Console.WriteLine("Total long process run: " + _totalLongProcess);
            Console.WriteLine("Total append process run: " + _totalAppendProcess);
            Console.ReadLine();
        }
    }
}

judahr

enhancement

Problem: To remove expired items from the cache requires either explicitly doing it or requiring the system to execute a timer. A timer can be resource expensive. Keeping track of what needs to be removed duplicates the role of the cache expiration policy.

Solution 1: Implementing the MemoryCache.Trim() function would be a nice in-between allowing a program to call a single method to trim the size of a possibly overgrown cache. This allows the underlying cache to perform the work with information it is already tracking. This would not be another ExpirationMode.

Solution 2: Implement an explicit call that will tell the provider to go through and remove items that are expired. Rather than a timer, this can be called less frequently based on when the client application knows it is idle or has completed tasks. This would not be another ExpirationMode. It would simply expose to the client app what the ImmediateExpiration is already doing, but less frequently.

Background: Iterating over data and caching expensive calculation results so later iterations can use it. Because the data and calculations are time series, the actual cache duration can be short and the application can perform a trim at intervals based on the data.

PatryxCShark

bug

Describe the bug
Absolute expiration does not seem to work. In unit tests (C#, Windows) it is fine, but after deploying to a Linux server the function that loads the data is called every time.

To Reproduce

Let's say class: MyCachingManager:

And example of using:

Any idea why addItemFactory (LoadMyData) is called every time? When I used MemoryCache there was no problem and it worked as expected (expiry every 3 minutes).

Expected behavior
The entry in LazyCache should expire 3 minutes after the data was last loaded into the cache.
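
The MyCachingManager snippet and usage example are not included in this extract. The pattern being described is roughly the following sketch (illustrative only; LoadMyData and the key name are placeholders):

```csharp
using System;
using LazyCache;

IAppCache cache = new CachingService();

// Placeholder for the reporter's addItemFactory
string LoadMyData() => DateTime.UtcNow.ToString("O");

// Cache the loaded data with a 3 minute absolute expiration;
// LoadMyData should only run again once the entry has expired
var data = cache.GetOrAdd("my-data-key", () => LoadMyData(), DateTimeOffset.UtcNow.AddMinutes(3));
```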

Framework and Platform

  • OS: Windows/Linux (deploy)
  • Framework: net5.0
  • LazyCache.AspNetCore Version: 2.4.0

b-twis

enhancement

Firstly, actively using your library in multiple places for quite some time and very happy with it.

Regarding the recent inclusion of PR #142 adding TryGetValue: I don't feel it fits the theme of the library, as it does not perform GetValueFromAsyncLazy, so it could give you back a factory (AsyncLazy) instead of the actual value.

I read the comments and the end goal was something close to a cache.Contains (and you cannot use an out parameter with an async method), but it would be great if it also fit the normal pattern of the other Gets.

I did something similar to below using a wrapper class to handle a similar issue elsewhere.

Here is an attempt of it in LazyCache.
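
Neither snippet is included in this extract. The idea described, unwrapping the stored Lazy before returning it, could be sketched roughly like this; note this is an illustration rather than the code from the issue, and it unwraps manually because GetValueFromLazy is internal to CachingService:

```csharp
using System;
using LazyCache;

public static class AppCacheTryGetExtensions
{
    // Sketch only: return the produced value rather than the Lazy<T> wrapper.
    // Async entries (AsyncLazy<T>) would need similar handling.
    public static bool TryGet<T>(this CachingService cache, string key, out T value)
    {
        value = default;

        // TryGetValue may hand back the raw object stored by GetOrAdd, i.e. a Lazy<T>
        if (!cache.TryGetValue<object>(key, out var cached))
            return false;

        if (cached is T typed)          // stored directly as T (e.g. via Add)
        {
            value = typed;
            return true;
        }

        if (cached is Lazy<T> lazy)     // stored as the lazy wrapper used by GetOrAdd
        {
            value = lazy.Value;
            return true;
        }

        return false;
    }
}
```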

Thoughts? I can look at making a PR if you are interested.

Thanks,

Basil

kkd83

enhancement

Is your feature request related to a problem? Please describe.
I have a dependency-injected IAppCache which I can add items to, but I have to build each key so that there is no possible clash with another. This is usually fine, but it would be better if I could define a section dedicated to my types without worrying about a clash.

Describe the solution you'd like
I'm proposing adding an extension IAppCache GetSection(this IAppCache cache, string section) which would prepend the section to every key automatically.

This would be a very small non-breaking change.

Describe alternatives you've considered
None that I can think of.

Additional context
None.
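
A rough sketch of the kind of extension being proposed (hypothetical, not part of LazyCache; only a key-prefixing wrapper around GetOrAdd is shown rather than a full IAppCache implementation):

```csharp
using System;
using LazyCache;

// Hypothetical wrapper that namespaces keys with a section prefix
public class SectionedCache
{
    private readonly IAppCache _inner;
    private readonly string _section;

    public SectionedCache(IAppCache inner, string section)
    {
        _inner = inner;
        _section = section;
    }

    // Representative member; a full version would wrap the rest of IAppCache the same way
    public T GetOrAdd<T>(string key, Func<T> addItemFactory) =>
        _inner.GetOrAdd($"{_section}:{key}", addItemFactory);
}

public static class AppCacheSectionExtensions
{
    // The proposed GetSection extension; returns the key-prefixing wrapper
    public static SectionedCache GetSection(this IAppCache cache, string section) =>
        new SectionedCache(cache, section);
}
```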

thenerdynerd

There appears to be a null value returned from the out parameter when using the TryGetValue method.

The code appears to find the value in the cache if it is defined, but TryGetValue does not return the value and thus it cannot be used.

Is this a bug?

ExLuzZziVo

bug

Describe the bug
Hi, I am trying to use ExpirationMode.ImmediateEviction with sliding expiration in my project, and this makes the entry update every time I access it, ignoring the sliding expiration value.

To Reproduce

Expected behavior
The sliding expiration time shouldn't be ignored.

Framework and Platform

  • OS: Windows 10 21H1
  • Framework: netcoreapp3.1
  • LazyCache Version: 2.1.3

EnricoMassone

Can you please explain why adding a null item to the cache via IAppCache.Add is disallowed?

I'm asking because the underlying ASP.NET Core memory cache allows adding null items to the cache via IMemoryCache.Set.

Is this design choice intended to avoid a situation like the following one, where the semantics of the null reference are somewhat ambiguous?
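
The snippet is not included in this extract; a sketch of the kind of ambiguity presumably being referred to (illustrative only):

```csharp
using LazyCache;

IAppCache cache = new CachingService();

// If null payloads were accepted, a null result would be ambiguous:
cache.Add<string>("user-name", null);       // disallowed by LazyCache today
var name = cache.Get<string>("user-name");  // null could mean "never cached", "expired" or "a cached null"
```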

Thanks for the clarification.

r-bennett

bug

Describe the bug
DefaultCacheDurationSeconds is being ignored in some code paths. It only appears to be used in AppCacheExtensions, e.g. https://github.com/alastairtree/LazyCache/blob/e38695bf63b1d33d97032e995d7dd5609dd692b3/LazyCache/AppCacheExtensions.cs#L74, but not in the almost equivalent instance method https://github.com/alastairtree/LazyCache/blob/33d055c75b455eacf62add2ca41182b541b47981/LazyCache/CachingService.cs#L182

To Reproduce

Expected behavior
A new value generated for each call, with output e.g.

Actual behavior
The same value reused for "foo", with output e.g.

Framework and Platform

  • OS: Windows 10
  • Framework net5.0
  • LazyCache Version 2.1.3

Additional context
Similar to the report in https://github.com/alastairtree/LazyCache/issues/121, although this isn't because of lazy eviction.
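
The reproduction code is not included in this extract. A rough sketch of the behaviour being described, assuming the default duration is configured via DefaultCachePolicy.DefaultCacheDurationSeconds (an assumption about the config surface) and using the instance overload that takes an ICacheEntry factory:

```csharp
using System;
using System.Threading;
using LazyCache;

var cache = new CachingService();
cache.DefaultCachePolicy.DefaultCacheDurationSeconds = 1; // expect entries to live ~1 second

// Instance overload (Func<ICacheEntry, T>): reported NOT to apply the default duration
var first = cache.GetOrAdd("foo", entry => Guid.NewGuid());
Thread.Sleep(TimeSpan.FromSeconds(5));
var second = cache.GetOrAdd("foo", entry => Guid.NewGuid());

// Expected: a new value is generated for each call (first != second)
// Actual (reported): the same value is reused for "foo"
Console.WriteLine(first == second ? "Same value reused" : "New value generated");
```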

Temtaime

enhancement

Hello. To continue the closed #95 issue: your point that one can manually call Remove/Add is wrong. The whole point of this library is to provide a thread-safe and atomic way to cache something while evaluating a callback only once. AddOrUpdate cannot be emulated with custom Remove/Add because that requires additional synchronization. It can be useful when one knows that an element must be updated and such a request must be processed atomically. The update callback must be provided with the previous value and evaluated once.

Edminsson

enhancement

Hi, it seems to me that AsyncLazy always runs the factory on the thread pool even when the factory contains IO-bound code. Perhaps I'm missing something, but this does not seem very efficient. I believe Stephen Cleary has an implementation of AsyncLazy with a flag (AsyncLazyFlags.ExecuteOnCallingThread) that gives you the option to run the factory on the calling thread without calling Task.Factory.StartNew or Task.Run.

vanepsm

enhancement

Currently I'm using LazyCache with MemoryCache. The way this is implemented, it looks like I need to add a new parameter to the cache.GetOrAdd() call. That call site doesn't have visibility into the section of code that is fetching the entry, so I'm not able to set a size based on the size of the entry. I can only set a static size for every entry in that call.

Something like:
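
The snippet that followed "Something like:" is not included in this extract. One way to express the request is a hypothetical extension that derives the entry size from the value the factory produced (not an existing LazyCache API):

```csharp
using System;
using LazyCache;
using Microsoft.Extensions.Caching.Memory;

public static class SizedCacheExtensions
{
    // Hypothetical: compute the cache entry's Size from the value the factory produced
    public static T GetOrAddWithSize<T>(
        this IAppCache cache, string key, Func<T> addItemFactory, Func<T, long> sizeCalculator)
    {
        return cache.GetOrAdd(key, entry =>
        {
            var item = addItemFactory();
            entry.Size = sizeCalculator(item);   // size is only known once the item exists
            return item;
        });
    }
}
```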

svengeance

enhancement

I think in the longer term, LazyCache would benefit greatly from having additional cache providers. There are two immediate ones that come to mind:

FileCacheProvider - by storing metadata (expiration, weight, etc.) in addition to the serialized content itself, this provider would be capable of caching data between app executions.

FastMemoryCacheProvider - by discarding all the fripperies of dotnet's MemoryCache, it should be possible to build a provider dedicated to the lowest possible allocations and fastest performance. This cache provider would be useful for those who are using the cache in a hot path, specifically where calculations that take milliseconds are worth caching (and the cost of the cache is low nanoseconds).

@alastairtree I'm wondering if you had any ideas on an API to support additional providers. Ideally, users should be able to seamlessly switch between the providers.

Three possible approaches

1: Generic IAppCache

  • Users can configure one IOptions for each provider in Startup
  • Users can configure a default provider, when the non-generic IAppCache is injected
  • Users can inject either IAppCache or IAppCache<TProvider>.
  • Injecting IAppCache<TProvider> will cause LazyCache to provide a shared instance of that TProvider (which implements IAppCache) to the service

2: CacheFactory (see IHttpClientFactory)

  • Users can configure one IOptions for each provider in Startup

  • Users can inject ICacheFactory into their services

  • In the constructor of these services, users can create instances of their caches (see the sketch below)

  • Users can alter the cache provider's options at the point of creation through Create overloads

  • Gives way to future support for "region" caches, or "named" caches

3: Direct Provider Injection

  • Users can configure one IOptions for each provider in Startup
  • Users can directly inject a provider in the constructor

To be honest I like the second option the best. It's a step away from the current implementation, but adding one layer of abstraction gives the user a whole lot of flexibility. We could also then provide an implementation of IMockCacheFactory, which always returns mocks.
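
The inline examples did not survive extraction; a hypothetical shape for option 2, loosely following IHttpClientFactory, might look like this (none of these factory types exist in LazyCache today):

```csharp
using LazyCache;

// Hypothetical factory abstraction
public interface ICacheFactory
{
    IAppCache Create(string name);   // resolve a configured/named cache provider
}

// Consuming service: ask the factory for the cache it wants in the constructor
public class ProductService
{
    private readonly IAppCache _cache;

    public ProductService(ICacheFactory cacheFactory)
    {
        _cache = cacheFactory.Create("products");
    }
}
```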

TLDR

I may be off-base here entirely if this has already been thought out, but here are 3 implementation ideas for how we can support multiple providers in the future.

joakimriedel

enhancement

I'm looking for a library to cache values at certain hot paths in my code.

Browsing through your API, I see that the async methods are returning Task and not ValueTask. The latter would be preferred since most of the requests would be cached and could be returned as T and not require the overhead of generating an actual Task<T> object.

The enhancement is to change GetAsync<T> and GetOrAddAsync<T> to return ValueTask<T>, but it would require changing code all the way down to where T is actually returned in the providers to support ValueTask<T>. Perhaps for 3.0?
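
A sketch of the proposed signature change (hypothetical; the current API returns Task<T>):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

// Hypothetical ValueTask-based variants of the async members
public interface IValueTaskAppCache
{
    ValueTask<T> GetAsync<T>(string key);
    ValueTask<T> GetOrAddAsync<T>(string key, Func<ICacheEntry, Task<T>> addItemFactory);
}
```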

alastairtree

enhancement

To help users, add some nice XML documentation comments to the main IAppCache interface and its extensions in IAppCacheExtensions to guide developers on how to use those methods. PRs VERY welcome!

alastairtree

enhancement

The unit tests in LazyCache have got a bit messy over time. They need a tidy-up so that they are grouped better and arranged into different files based on what they are testing. Please help and send a PR!

alastairtree

enhancement

LazyCache currently uses statics for some config, such as the default cache duration, because of the age of the project. Remove all statics and instead use the more modern Options pattern with a strongly typed options object, which is more common in dotnet core. See https://docs.microsoft.com/en-us/aspnet/core/fundamentals/configuration/options?view=aspnetcore-3.1

CachingService should add a constructor dependency on IOptions<LazyCacheOptions> options. LazyCacheOptions should have properties for DefaultCacheDuration and NumberOfKeyLocks and anything else configurable.

This is a breaking change - requires a major version increase
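
A sketch of the proposed options object (hypothetical; neither the class nor the constructor overload exists today, and the default values shown are illustrative):

```csharp
// Hypothetical strongly typed options object replacing the current statics
public class LazyCacheOptions
{
    public int DefaultCacheDurationSeconds { get; set; } = 20 * 60; // current documented default
    public int NumberOfKeyLocks { get; set; } = 256;                // illustrative value only
}

// CachingService would then take IOptions<LazyCacheOptions> as a constructor dependency, and
// consumers would configure it via the standard Options pattern, e.g.
//   services.Configure<LazyCacheOptions>(o => o.DefaultCacheDurationSeconds = 300);
```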

granadacoder

enhancement

Is your feature request related to a problem? Please describe.

Hi.

https://github.com/alastairtree/LazyCache/blob/master/LazyCache.AspNetCore/LazyCacheServiceCollectionExtensions.cs

Non asp.net core applications (dotnet core console applications) could benefit from these IoC registrations.

Describe the solution you'd like

Isolate the below .cs/functionality

LazyCacheServiceCollectionExtensions.cs

to a different csproj that has no names/dependencies on "AspNetCore"

LazyCache.Extensions

or

LazyCache.DependencyInjection

or ????

Sn3akyP3t3

enhancement

Is your feature request related to a problem? Please describe.
No

Describe the solution you'd like
Neither the documentation nor the example source code shows examples of saving the cache to disk between runs of the application. I don't know whether this is an existing feature that just lacks documentation, or whether it doesn't exist at all. Whichever the case, this is a feature request to fill that gap or gaps.

Describe alternatives you've considered
Finding another library that can save the contents of an object to disk, like Python's pickle, maybe...?

Additional context
N/A

replaysMike

bug

Trying to write a Redis provider for the current version of LazyCache. Ended up hitting a wall with the current implementation.

The bug
When calling _cache.GetOrAddAsync(), the caching service calls GetOrCreate on the underlying provider instead of GetOrCreateAsync. This prevents the task from being unwrapped correctly, and the resulting object is wrong.

See CachingService.cs:172 and CachingService.cs:192.

Couldn't find an easy way around it because of the way the cache factory is used and passes the type as an object.

ArnaudB88

question

I want to cache the result of a method with LazyCache. That method uses other classes which are instantiated with dependency injection. Instances of that class are created with a lifetime manager using the HTTPContext. In short, the 'addItemFactory' method which I want to cache the result of, needs the HttpContext.

It seems that when execution of the async method is started, the HTTP context is lost. The point where it loses the context is the constructor:

When starting a new task (Factory.StartNew()), the context is lost.

doaji

question

Hello Alastair,

I am currently using version 0.7.1.44 in my .net application. I added some logging inside my DB call method and noticed that DB calls are being made rather frequently. I currently have a sliding expiration policy of 7 days, but the method is called multiple times on the same day.

here is a portion of today's logs

2020-01-16 00:07:41.7333 INFO Calling ActivityFeedDB: 5a10c9e6-eb2c-41f0-9fc8-20c0fe6643f6:POSTS
2020-01-16 00:22:51.7508 INFO Calling ActivityFeedDB: 5a10c9e6-eb2c-41f0-9fc8-20c0fe6643f6:POSTS
2020-01-16 00:34:34.8084 INFO Calling ActivityFeedDB: 5a10c9e6-eb2c-41f0-9fc8-20c0fe6643f6:POSTS
2020-01-16 01:30:59.2897 INFO Calling ActivityFeedDB: 5a10c9e6-eb2c-41f0-9fc8-20c0fe6643f6:POSTS
2020-01-16 03:31:42.4026 INFO 1/16/2020 3:31:42 AM Removing Cache: CachedActivityFeed:5a10c9e6-eb2c-41f0-9fc8-20c0fe6643f6-POSTS CacheSpecificEviction
2020-01-16 11:27:02.1429 INFO Calling ActivityFeedDB: 5a10c9e6-eb2c-41f0-9fc8-20c0fe6643f6:POSTS
2020-01-16 14:37:11.0202 INFO Calling ActivityFeedDB: 5a10c9e6-eb2c-41f0-9fc8-20c0fe6643f6:POSTS

these are snippets of my code:

private static readonly CacheItemPolicy policy = new CacheItemPolicy()
{
    Priority = CacheItemPriority.Default,
    RemovedCallback = new CacheEntryRemovedCallback(Factory.OnCacheExpired),
    SlidingExpiration = new TimeSpan(7, 0, 0, 0, 0)
};

public static CachedActivityFeed GetItem(Guid? activityid, int? id, Site_Activities.Activities type)
{
    try
    {
        return Factory.GetOrAdd(GetCacheID(activityid, id, type), () => Construct(activityid, id, type), policy);
    }
    catch (Exception)
    {
        return null;
    }
}

public static T GetOrAdd<T>(string key, Func<T> func, CacheItemPolicy cacheItemPolicy)
{
    var result = TempCache.cache.GetOrAdd(key, func, cacheItemPolicy);
    return result;
}

public static IAppCache cache = new CachingService();

Am I doing something wrong implementation-wise? Or is this a known bug that's fixed with an upgrade?

Great Lib!

Versions

Quick list of the latest released versions

2.0.0-beta02 - Mar 04, 2018

2.0.0-beta01 - Mar 04, 2018

Library Stats (Sep 01, 2022)

Subscribers: 43
Stars: 1.4K
Forks: 137
Issues: 41
