LINQ to DB

LINQ to DB is the fastest LINQ database access library offering a simple, light, fast, and type-safe layer between your POCO objects and your database.

Architecturally it is one step above micro-ORMs like Dapper, Massive, or PetaPoco, in that you work with LINQ expressions, not with magic strings, while maintaining a thin abstraction layer between your code and the database. Your queries are checked by the C# compiler and allow for easy refactoring.

However, it's not as heavy as LINQ to SQL or Entity Framework. There is no change-tracking, so you have to manage that yourself, but on the positive side you get more control and faster access to your data.

In other words LINQ to DB is type-safe SQL.

linq2db is a .NET Foundation project.

Development version nuget feed (how to use)

Standout Features

  • Rich Querying API:
    • Explicit Join Syntax (In addition to standard LINQ join syntax.)
    • CTE Support
    • Bulk Copy/Insert
    • Window/Analytic Functions
    • Merge API
  • Extensibility:
    • Ability to Map Custom SQL to Static Functions

Visit our blog and see the GitHub.io documentation for more details.

Code examples and demos can be found here or in tests.

Release Notes page.

Related linq2db and 3rd-party projects

  • linq2db.EntityFrameworkCore (adds support for linq2db functionality in EF.Core projects)
  • LINQPad Driver
  • DB2 iSeries Provider
  • ASP.NET Core 2 Template
  • ASP.NET Core 3 Template with Angular
  • ASP.NET Core 5 Template
  • PostGIS extensions for linq2db

Notable open-source users:

  • nopCommerce (starting from v4.30) - popular open-source e-commerce solution
  • OdataToEntity - library to create OData service from database context
  • SunEngine - site, blog and forum engine

Unmaintained projects:

  • LinqToDB.Identity - ASP.NET Core Identity provider using linq2db
  • IdentityServer4.LinqToDB - IdentityServer4 persistence layer using linq2db

How to help the project

No, this is not the donate link. We need something far more valuable: your time. If you really want to help us, please read this post.

Let's get started

From NuGet:

  • Install-Package linq2db

Configuring connection strings

Using Connection Options Builder

You can configure connection options in code using the LinqToDbConnectionOptionsBuilder class (check the class for available options):

// create options builder
var builder = new LinqToDbConnectionOptionsBuilder();

// configure connection string
builder.UseSqlServer(connectionString);

// or using custom connection factory
builder.UseConnectionFactory(
    SqlServerTools.GetDataProvider(
        SqlServerVersion.v2017,
        SqlServerProvider.MicrosoftDataSqlClient),
    () =>
    {
        var cn = new SqlConnection(connectionString);
        cn.AccessToken = accessToken;
        return cn;
    });

// pass configured options to data connection constructor
var dc = new DataConnection(builder.Build());

Using Config File (.NET Framework)

In your web.config or app.config make sure you have a connection string (check this file for supported providers):

<connectionStrings>
  <add name="Northwind" 
    connectionString = "Server=.\;Database=Northwind;Trusted_Connection=True;Enlist=False;" 
    providerName     = "SqlServer" />
</connectionStrings>

Using Connection String Settings Provider

.NET Core did not support System.Configuration before version 3.0, so to configure connection strings you should implement ILinqToDBSettings, for example:

public class ConnectionStringSettings : IConnectionStringSettings
{
    public string ConnectionString { get; set; }
    public string Name { get; set; }
    public string ProviderName { get; set; }
    public bool IsGlobal => false;
}

public class MySettings : ILinqToDBSettings
{
    public IEnumerable<IDataProviderSettings> DataProviders => Enumerable.Empty<IDataProviderSettings>();

    public string DefaultConfiguration => "SqlServer";
    public string DefaultDataProvider => "SqlServer";

    public IEnumerable<IConnectionStringSettings> ConnectionStrings
    {
        get
        {
            yield return
                new ConnectionStringSettings
                {
                    Name = "Northwind",
                    ProviderName = "SqlServer",
                    ConnectionString = @"Server=.\;Database=Northwind;Trusted_Connection=True;Enlist=False;"
                };
        }
    }
}

Then, at program startup (in Startup.cs, for example), assign it before the first query is executed:

DataConnection.DefaultSettings = new MySettings();

ASP.NET Core

See article.

Now let's create a POCO class

Important: you can also generate these classes from your database using T4 templates. A demonstration video can be found here.

using System;
using LinqToDB.Mapping;

[Table(Name = "Products")]
public class Product
{
  [PrimaryKey, Identity]
  public int ProductID { get; set; }

  [Column(Name = "ProductName"), NotNull]
  public string Name { get; set; }

  // ... other columns ...
}

At this point LINQ to DB doesn't know how to connect to our database or which POCOs go with what database. All this mapping is done through a DataConnection class:

public class DbNorthwind : LinqToDB.Data.DataConnection
{
  public DbNorthwind() : base("Northwind") { }

  public ITable<Product> Product => GetTable<Product>();
  public ITable<Category> Category => GetTable<Category>();

  // ... other tables ...
}

We call the base constructor with the "Northwind" parameter. This parameter (called the configuration name) has to match the name="Northwind" we defined above in our connection string. We also have to register the Product class we defined above so that we can write LINQ queries against it.

And now let's get some data:

using LinqToDB;
using LinqToDB.Common;

public static List<Product> All()
{
  using (var db = new DbNorthwind())
  {
    var query = from p in db.Product
                where p.ProductID > 25
                orderby p.Name descending
                select p;
    return query.ToList();
  }
}

Make sure you always wrap your DataConnection class (in our case DbNorthwind) in a using statement. This is required for proper resource management, such as releasing database connections back into the pool. More details.

Selecting Columns

Most of the time we retrieve the entire row from the database:

from p in db.Product
where p.ProductID == 5
select p;

However, sometimes retrieving all the fields is wasteful and we want only certain ones, while still using our POCOs; this is challenging for libraries that rely on object tracking, like LINQ to SQL.

from p in db.Product
orderby p.Name descending
select new Product
{
  Name = p.Name
};

Composing queries

Rather than concatenating strings, we can 'compose' LINQ expressions. In the example below, the final SQL will differ depending on whether onlyActive is true and whether searchFor is null.

public static List<Product> All(bool onlyActive, string searchFor)
{
  using (var db = new DbNorthwind())
  {
    var products = from p in db.Product 
                   select p;

    if (onlyActive)
    {
      products = from p in products 
                 where !p.Discontinued 
                 select p;
    }

    if (searchFor != null)
    {
      products = from p in products 
                 where p.Name.Contains(searchFor) 
                 select p;
    }

    return products.ToList();
  }
}

Paging

Often we need code that returns only a subset of the entire dataset. Expanding on the previous example, here is what a product search function could look like.

Keep in mind that the code below queries the database twice: once to find the total number of records (something many paging controls require) and once to return the actual data.

public static List<Product> Search(string searchFor, int currentPage, int pageSize, out int totalRecords)
{
  using (var db = new DbNorthwind())
  {
    var products = from p in db.Product 
                   select p;

    if (searchFor != null)
    {
      products = from p in products 
                 where p.Name.Contains(searchFor) 
                 select p;
    }

    totalRecords = products.Count();

    return products.Skip((currentPage - 1) * pageSize).Take(pageSize).ToList();
  }
}
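If the total count is not strictly needed, a common way to avoid the second round trip is to fetch one row more than the page size and use the extra row as a has-more flag. A sketch, using the same DbNorthwind context (the tuple return type and RemoveAt bookkeeping are illustrative choices, not linq2db API):

```csharp
public static (List<Product> Items, bool HasMore) Page(string searchFor, int currentPage, int pageSize)
{
  using (var db = new DbNorthwind())
  {
    var products = from p in db.Product
                   select p;

    if (searchFor != null)
    {
      products = products.Where(p => p.Name.Contains(searchFor));
    }

    // request one row more than the page size;
    // the extra row only signals that a next page exists
    var items = products
      .Skip((currentPage - 1) * pageSize)
      .Take(pageSize + 1)
      .ToList();

    var hasMore = items.Count > pageSize;
    if (hasMore) items.RemoveAt(pageSize); // drop the sentinel row before returning

    return (items, hasMore);
  }
}
```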

Joins

This assumes we added a Category class, just like we did with the Product class, defined all the fields, and registered it in our DbNorthwind data access class. We can now write an INNER JOIN query like this:

from p in db.Product
join c in db.Category on p.CategoryID equals c.CategoryID
select new Product
{
  Name = p.Name,
  Category = c
};

and a LEFT JOIN query like this:

from p in db.Product
from c in db.Category.Where(q => q.CategoryID == p.CategoryID).DefaultIfEmpty()
select new Product
{
  Name = p.Name,
  Category = c
};

More samples can be found here.

Creating your POCOs

In the previous example we assigned an entire Category object to our product, but what if we want all the fields of our Product class without specifying each one by hand? Unfortunately, we cannot write this:

from p in db.Product
from c in db.Category.Where(q => q.CategoryID == p.CategoryID).DefaultIfEmpty()
select new Product(c);

This assumes the Product class has a constructor that takes a Category object. The query above won't work, but we can work around it with the following query:

from p in db.Product
from c in db.Category.Where(q => q.CategoryID == p.CategoryID).DefaultIfEmpty()
select Product.Build(p, c);

For this to work, we need a function in the Product class that looks like this:

public static Product Build(Product product, Category category)
{
  if (product != null)
  {
    product.Category = category;
  }
  return product;
}

One caveat with this approach: when used with composed queries (see the example above), the Build projection must appear only in the final select.
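For example, combining the composable search pattern from earlier with Build might look like this (a sketch assuming the DbNorthwind context and the Build method shown above):

```csharp
public static List<Product> AllWithCategory(bool onlyActive)
{
  using (var db = new DbNorthwind())
  {
    // compose over plain entities first...
    var products = from p in db.Product
                   select p;

    if (onlyActive)
    {
      products = from p in products
                 where !p.Discontinued
                 select p;
    }

    // ...and call Build only in the final select
    var query = from p in products
                from c in db.Category.Where(q => q.CategoryID == p.CategoryID).DefaultIfEmpty()
                select Product.Build(p, c);

    return query.ToList();
  }
}
```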

Insert

At some point we will need to add a new Product to the database. One way is to call the Insert extension method found in the LinqToDB namespace, so make sure you import it.

using LinqToDB;

using (var db = new DbNorthwind())
{
  db.Insert(product);
}

This inserts all the columns from our Product class, but without retrieving the generated identity value. To do that we can use InsertWith*Identity methods, like this:

using LinqToDB;

using (var db = new DbNorthwind())
{
  product.ProductID = db.InsertWithInt32Identity(product);
}

There is also InsertOrReplace, which updates a database record if it is found by its primary key, or adds it otherwise.
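A minimal sketch of InsertOrReplace; the record's primary key (ProductID in our mapping) determines whether an update or an insert is issued:

```csharp
using LinqToDB;

using (var db = new DbNorthwind())
{
  // updates the row with the same ProductID if it exists, inserts it otherwise
  db.InsertOrReplace(product);
}
```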

If you need to insert only certain fields, or use values generated by the database, you could write:

using LinqToDB;

using (var db = new DbNorthwind())
{
  db.Product
    .Value(p => p.Name, product.Name)
    .Value(p => p.UnitPrice, 10.2m)
    .Value(p => p.Added, () => Sql.CurrentTimestamp)
    .Insert();
}

This method also lets us build insert statements piece by piece:

using LinqToDB;

using (var db = new DbNorthwind())
{
  var statement = db.Product
                    .Value(p => p.Name, product.Name)
                    .Value(p => p.UnitPrice, 10.2m);

  if (storeAdded) statement = statement.Value(p => p.Added, () => Sql.CurrentTimestamp);

  statement.Insert();
}

Update

Updating records follows a similar pattern to inserting. We have an extension method that updates all the columns in the database:

using LinqToDB;

using (var db = new DbNorthwind())
{
  db.Update(product);
}

And we also have a lower level update mechanism:

using LinqToDB;

using (var db = new DbNorthwind())
{
  db.Product
    .Where(p => p.ProductID == product.ProductID)
    .Set(p => p.Name, product.Name)
    .Set(p => p.UnitPrice, product.UnitPrice)
    .Update();
}

Similarly, we can break an update query into multiple pieces if needed:

using LinqToDB;

using (var db = new DbNorthwind())
{
  var statement = db.Product
                    .Where(p => p.ProductID == product.ProductID)
                    .Set(p => p.Name, product.Name);

  if (updatePrice) statement = statement.Set(p => p.UnitPrice, product.UnitPrice);

  statement.Update();
}

You're not limited to updating a single record. For example, we could discontinue all the products that are no longer in stock:

using LinqToDB;

using (var db = new DbNorthwind())
{
  db.Product
    .Where(p => p.UnitsInStock == 0)
    .Set(p => p.Discontinued, true)
    .Update();
}

Delete

Similar to how you update records, you can also delete records:

using LinqToDB;

using (var db = new DbNorthwind())
{
  db.Product
    .Where(p => p.Discontinued)
    .Delete();
}

Bulk Copy

The bulk copy feature supports transferring large amounts of data into a table from another data source. For more details, read this article.

using LinqToDB.Data;

[Table(Name = "ProductsTemp")]
public class ProductTemp
{
  [PrimaryKey]
  public int ProductID { get; set; }

  [Column(Name = "ProductName"), NotNull]
  public string Name { get; set; }

  // ... other columns ...
}

var list = new List<ProductTemp>();
// populate list

using (var db = new DbNorthwind())
{
  db.BulkCopy(list);
}
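Bulk copy behavior can be tuned with BulkCopyOptions; which options take effect depends on the database provider. A sketch (the option values here are illustrative, not recommendations):

```csharp
using LinqToDB.Data;

using (var db = new DbNorthwind())
{
  db.BulkCopy(new BulkCopyOptions
  {
    MaxBatchSize = 1000,                         // number of rows sent per batch
    BulkCopyType = BulkCopyType.ProviderSpecific // use the provider's native bulk API when available
  }, list);
}
```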

Transactions

Using database transactions is easy. All you have to do is call BeginTransaction() on your DataConnection, run one or more queries, and then commit the changes by calling CommitTransaction(). If something goes wrong and you need to roll back your changes, you can either call RollbackTransaction() or throw an exception.

using (var db = new DbNorthwind())
{
  db.BeginTransaction();
  
  // ... select / insert / update / delete ...

  if (somethingIsNotRight)
  {
    db.RollbackTransaction();
  }
  else
  {
    db.CommitTransaction();
  }
}
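These calls also have async counterparts in recent linq2db versions, which fit naturally into async code paths (a sketch, assuming an async calling context):

```csharp
using (var db = new DbNorthwind())
{
  await db.BeginTransactionAsync();

  // ... select / insert / update / delete ...

  await db.CommitTransactionAsync();
}
```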

Alternatively, you can use the built-in .NET TransactionScope class:

// don't forget that isolation level is serializable by default
using (var transaction = new TransactionScope())
// or for async code
// using (var transaction = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
{
  using (var db = new DbNorthwind())
  {
    ...
  }
  transaction.Complete();
}

Merge

Here you can read about MERGE support.

Window (Analytic) Functions

Here you can read about Window (Analytic) Functions support.

MiniProfiler

If you would like to use MiniProfiler or another profiling tool that wraps the ADO.NET provider classes, you need to configure the regular DataConnection to use the wrapped connection.

// example of SQL Server-backed data connection with MiniProfiler enabled for debug builds
public class DbDataContext : DataConnection
{
// let's use the profiler only for debug builds
#if !DEBUG
  public DbDataContext() : base("Northwind") { }
#else
  public DbDataContext() : base(GetDataProvider(), GetConnection())
  {
    // this is the important part:
    // here we tell linq2db how to access the underlying ADO.NET classes of the used provider.
    // if you don't configure these mappings, linq2db will be unable to use provider-specific functionality,
    // which could lead to loss or unavailability of some features when the profiled connection is enabled
    MappingSchema.SetConvertExpression<ProfiledDbConnection,  IDbConnection> (db => db.WrappedConnection);
    MappingSchema.SetConvertExpression<ProfiledDbDataReader,  IDataReader>   (db => db.WrappedReader);
    MappingSchema.SetConvertExpression<ProfiledDbTransaction, IDbTransaction>(db => db.WrappedTransaction);
    MappingSchema.SetConvertExpression<ProfiledDbCommand,     IDbCommand>    (db => db.InternalCommand);
  }

  private static IDataProvider GetDataProvider()
  {
     // create provider instance (SQL Server 2012 provider in our case)
     return new SqlServerDataProvider("", SqlServerVersion.v2012);
  }

  private static IDbConnection GetConnection()
  {
     // create a provider-specific connection instance (SqlConnection in our case)
     var dbConnection = new SqlConnection(@"Server=.\SQL;Database=Northwind;Trusted_Connection=True;Enlist=False;");

     // wrap it in the profiler's connection implementation
     return new StackExchange.Profiling.Data.ProfiledDbConnection(dbConnection, MiniProfiler.Current);
  }
#endif
}

More

Still have questions? Check out our documentation site and FAQ.

Versions

Quick list of the latest released versions

v4.2.0 - Sep 01, 2022

Release highlights:

  • ClickHouse database support
  • PostgreSQL 15 support (MERGE and InsertOrReplace over MERGE)
  • SQL Server 2022 support (IGNORE NULLS qualifiers for FIRST_VALUE/LAST_VALUE window functions, IS [NOT] DISTINCT FROM operator support)
  • BeforeReaderDispose(Async) command interceptor
  • Npgsql 7 support, NpgsqlInterval support from Npgsql 6
  • TransientRetryPolicy retry policy, which uses the new DbException.IsTransient property (retry policy available only for net6.0+ TFMs)
  • support sequence name configuration for columns using fluent mapping
  • minor breaking change to type mapping, which could change how System.Convert.ToDecimal/ LinqToDB.Common.Convert converted to SQL for decimal types. See release notes for more details if you use those conversions
  • many other fixes and improvements

Scaffold CLI:

  • default-schemas option
  • [include/exclude]-[stored-procedures/scalar-functions/aggregate-functions] schema filtering options
  • fixed the "ColumnType not provided by schema for table" error when scaffolding an Access database using an ODBC connection
  • many other fixes and improvements

Release notes

Nugets

API diff since v4.1.1

v4.1.1 - Jul 07, 2022

Release highlights:

  • bugfixes/improvements

Release notes

Nugets

API diff since v4.1.0

v4.1.0 - Jun 16, 2022

Release highlights:

  • support for Devart.Data.Oracle Oracle provider
  • various fixes/improvements

Release notes

Nugets

API diff since v4.0.1

v4.0.1 - May 27, 2022

Release highlights:

  • address CLI scaffold tool feedback
  • add support for text-based Guid mappings for SQLite
  • improve handling of date/time-related types for SQL Server

Release notes

Nugets

API diff since v4.0.0

v4.0.0 - May 19, 2022

There are no changes since RC2.

Main changes since 3.7.0 release:

  • #2452: introduced a query extensions API to extend queries with custom SQL (e.g. hints) at specific points. See more details here
  • #2643: added support for multiple active queries (MARS) over a single connection (MySql and PostgreSQL are not supported due to provider/database protocol limitations)
  • #2812: removed support for the SQL Server 2000 dialect
  • #2927: added interceptors support, replacing various properties, events and delegates (mostly in DataConnection) and making interceptors work with DataContext. Also check the migration notes to see which members were replaced with interceptors and how to migrate your code
  • #3098: released a new database scaffold dotnet tool that replaces the old T4 templates. See more details here
  • #3410: refactored the remote context to support .NET Core and gRPC. See more details here
  • #3441: removed the netcoreapp2.1 target framework (Linq To DB is still usable from .NET Core 2.1 via the netstandard2.0 TFM)
  • #3502: added DateOnly type support in mappings for all databases
  • #3536: added support for database packages
  • default SQL Server provider changed from System.Data.SqlClient to Microsoft.Data.SqlClient (use the SqlServerTools.Provider property to revert this change)
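As noted in the last bullet, reverting to the legacy SQL Server provider is a one-line configuration change. A minimal sketch, assuming the `SqlServerTools.Provider` property and `SqlServerProvider` enum described in the release notes:

```csharp
using LinqToDB.DataProvider.SqlServer;

// Revert the default SQL Server ADO.NET provider from
// Microsoft.Data.SqlClient back to System.Data.SqlClient.
// Set this once at application startup, before any connections are created.
SqlServerTools.Provider = SqlServerProvider.SystemDataSqlClient;
```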

Migration notes

See the release notes for more details

Nugets

v4.0.0-rc.2 - May 15, 2022

This is the last release candidate before the final 4.0 release.

Release highlights:

  • #3502: DateOnly type support for all databases
  • #3534: data model customization support for the scaffold utility and an option to generate an IEquatable<T> implementation on scaffolded entities
  • #3536: packages support
  • #3547: new Sql.NullIf function to generate the NULLIF SQL function
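A minimal sketch of the new Sql.NullIf helper, assuming a `NullIf(value, compareTo)` signature matching SQL NULLIF semantics; the `db` context, `Order` entity, and `Discount` column are hypothetical:

```csharp
using System.Linq;
using LinqToDB;

// NULLIF(Discount, 0): yields NULL when Discount equals 0,
// otherwise returns Discount unchanged.
var query = db.GetTable<Order>()
    .Select(o => new
    {
        o.Id,
        Discount = Sql.NullIf(o.Discount, 0)
    });
```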

Release notes

Nugets

API diff since v4.0.0-RC.1

v4.0.0-rc.1 - Apr 29, 2022

This is the first of two planned release candidates before the final 4.0 release.

Release highlights:

  • #2452: query extensions API to support generation of query hints and other custom SQL in different parts of the SQL query, plus many ready-to-use extension methods for query hints for all databases
  • #2582: detection of changes to the mapping schema, to avoid issues with changes being ignored due to cached queries
  • #3098: first preview of the new dotnet tool for database scaffolding. Give your feedback here
  • #3410: remote context refactoring to support .NET Core, gRPC and user-defined transports
  • .NET Core 2.1 support dropped (linq2db is still available there using the netstandard2.0 build)
  • #3499: CTE support for the SQL Server 2005 dialect
  • ~#3502: DateOnly type support for all databases~ due to a nuget publishing error, this feature wasn't released. It will be in RC2
  • many other bugfixes and small improvements; check the full release notes for details

Release notes

Nugets

API diff since v4.0.0-preview.10

v3.7.0 - Apr 07, 2022

This is the last release in the v3 branch. The 4.0.0 release is scheduled for later this month.

Release highlights:

  • #3076: traces now include transaction start, commit and rollback events
  • #3328: big rework of the OUTPUT/RETURNING clause: fixed complex queries for SQL Server and added support in INSERT, UPDATE, DELETE and MERGE statements for all other databases that support this clause (MariaDB, SQLite, PostgreSQL and Firebird), except PostgreSQL's query-from-RETURNING feature
  • #3339: SQL ROW (tuple) support
  • #3461: PostgreSQL enums support
  • #3474: added missing/fixed support for the Guid type in CreateTable and queries for Firebird, DB2, SQL CE, Sybase ASE and Informix
  • #3357: added support for record classes and other record-like types in union-like queries (SET queries, recursive CTEs)
  • #3359: fixed generated SQL for recursive CTEs with multiple non-anchor queries
  • #3478: added support for Net5.IBM.Data.Db2* and Net.IBM.Data.Db2* providers
  • many other bugfixes and small improvements; check the full release notes for details

Release notes

Nugets

API diff since v3.6.0

v4.0.0-preview.10 - Apr 29, 2022

Release notes

Nugets

API diff since v4.0.0-preview.9

v4.0.0-preview.9 - Apr 29, 2022

Release notes

Nugets

API diff since v4.0.0-preview.8

v4.0.0-preview.8 - Apr 29, 2022

Release notes

Nugets

API diff since v4.0.0-preview.7

v4.0.0-preview.7 - Oct 14, 2021

Release notes

Nugets

API diff since v4.0.0-preview.6

v4.0.0-preview.6 - Oct 02, 2021

Release notes

Nugets

API diff since v4.0.0-preview.5

v3.4.5 - Oct 02, 2021

Release notes

Nugets

API diff since v3.4.4

v4.0.0-preview.5 - Sep 10, 2021

Release notes

Nugets

API diff since v4.0.0-preview.4

v3.4.4 - Sep 10, 2021

Release notes

Nugets

API diff since v3.4.3

v4.0.0-preview.4 - Apr 29, 2022

Release notes

Nugets

API diff since v4.0.0-preview.3

v3.4.3 - Aug 13, 2021

Release notes

Nugets

API diff since v3.4.2

v4.0.0-preview.3 - Apr 29, 2022

Release notes

Nugets

API diff since v4.0.0-preview.2

v3.4.2 - Jul 09, 2021

Release notes

Nugets

API diff since v3.4.1

v4.0.0-preview.2 - Jul 01, 2021

Release notes

Nugets

API diff since v4.0.0-preview.1

v3.4.1 - Jul 01, 2021

Release notes

Nugets

API diff since v3.4.0

v4.0.0-preview.1 - Apr 29, 2022

Release notes

Nugets

API diff since v3.4.0

v3.4.0 - Jun 03, 2021

Release notes

Nugets

API diff since v3.3.0

v3.3.0 - Apr 29, 2022

Release notes

Nugets

API diff since v3.2.3

Library Stats (Sep 03, 2022)

Subscribers: 160
Stars: 2.4K
Forks: 412
Issues: 360
