Extra C# Driver Features
In general, the C# and Java drivers are API compatible, and the Java documentation can be followed without much trouble.
The C# driver, however, offers additional language syntax that goes beyond the original Java driver. The additional language features are highlighted below.
Java and C#
//Java
r.table("marvel").getAll("man_of_steel").optArg("index", "code_name")
.run(conn);
//C#
R.Table("marvel").GetAll("man_of_steel").OptArg("index", "code_name")
.Run(conn);
Additionally, the C# driver supports indexer operators, allowing the use of anonymous types as optional arguments:
R.Table("marvel").GetAll("man_of_steel")[new {index="code_name"}]
.Run(conn);
Java, Python, JavaScript and C#
r.table("marvel").get("12345")["field"]["subfield"] // Python
r.table("marvel").get("12345")("field")("subfield") // JavaScript
r.table("marvel").get("12345").getField("field").getField("subfield") // Java
R.Table("marvel").Get("12345").GetField("field").GetField("subfield") // C#
Additionally, the C# driver supports .bracket() via the indexer operator:
R.Table("marvel").Get("12345")["field"]["subfield"] // C#
The Python, JavaScript, and Java drivers all expect the developer to know the shape of a query result. The C# driver follows a similar paradigm. However, supplying additional generic type information can help the DLR to perform a cast into a specific type.
new Foobar {id = "a", Baz = 4, Qux = 4}
// DLR dynamic
/* var result is 4 of type dynamic */
var result = R.Table("foobar").Get("a")["Baz"].Run(conn);
/* long result is 4 of type long */
long result = R.Table("foobar").Get("a")["Baz"].Run(conn);
// Give the compiler and run-time more type information with Run<T>()
/* int result is 4 of type int */
int result = R.Table("foobar").Get("a")["Baz"].Run<int>(conn);
Notice the result declarations long and int and their respective Run and Run<T> calls. The underlying deserializer, Newtonsoft.Json, determines that the type (without T) is long. Given T as int, the deserializer can perform a more specific deserialization of the result.
When the result of a query is a stream (i.e., query → stream), be sure to declare the expected result as a Cursor<T>. The following example shows how to expect a cursor:
Cursor<int> result = R.Range(1,4).Run<int>(conn);
Notice the DLR magic above: Run<int> returns Cursor<int>, not int. Since the server's response to the query above is a stream, it is the responsibility of the driver to return a cursor. Also note that in the example above, T in Run<T> specifies the cursor item type.
There is a slight performance cost to Run<T> since the execution context involves the DLR. As a best practice, queries that return a cursor should use the .RunCursor<T>() run helper. .RunCursor<T> bypasses the DLR and the execution context remains within the CLR type system, returning a Cursor<T> just like any normal CLR method call.
Cursor<int> result = R.Range(1,4).RunCursor<int>(conn);
foreach(var i in result){
Console.WriteLine(i);
}
/* Output:
1
2
3
*/
Note: A Cursor<T> instance is not thread-safe. Cursor<T> items should only be enumerated over and consumed by a single thread. However, the underlying Connection conn in the example above is thread-safe. Connection conn can be used by multiple threads to send multiple queries to a RethinkDB server.
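For example, a minimal sketch of two queries awaited concurrently over one connection (the document ids here are hypothetical; RunAtomAsync is the async counterpart of the RunAtom helper shown later on this page):
// Two independent queries issued over the same, thread-safe connection.
var queryA = R.Table("marvel").Get("superman").RunAtomAsync<JObject>(conn);
var queryB = R.Table("marvel").Get("ant_man").RunAtomAsync<JObject>(conn);

await Task.WhenAll(queryA, queryB);

Console.WriteLine(queryA.Result["code_name"]);
Console.WriteLine(queryB.Result["code_name"]);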
The C# driver offers additional run helpers that bypass the DLR and offer better query syntax and performance. See the Run Helpers page for more information.
Every .Run*() method has a .Run*Async() counterpart. For example:
var games = new[]
{
new Game {id = 2, player = "Bob", points = 15, type = "ranked"},
new Game {id = 5, player = "Alice", points = 7, type = "free"},
new Game {id = 11, player = "Bob", points = 10, type = "free"},
new Game {id = 12, player = "Alice", points = 2, type = "free"},
};
var result = await R.Db("mydb").Table("mytable")
.Insert(games)
.RunResultAsync(conn);
result.AssertInserted(4);
In the example above, var result = await is used in conjunction with the .RunResultAsync helper. The Task Parallel Library in .NET is used to await the query's response from the RethinkDB server. Async usage is the recommended approach for highly concurrent scenarios.
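For reference, the Game POCO used in these examples is not defined on this page; a plausible shape, inferred from the property names used above, is:
public class Game
{
    public int id { get; set; }
    public string player { get; set; }
    public int points { get; set; }
    public string type { get; set; }
}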
All async methods support CancellationToken. Cancellation pertains to the semantic wait operation of a pending network request. Cancellation does not mean the cancellation or rollback of an executing query on the RethinkDB server.
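A minimal sketch of passing a token, reusing the games array from above. The exact Run*Async overload that accepts the token is an assumption here, so check the overloads available in your driver version:
// Give up *waiting* after 5 seconds; the query itself is not
// cancelled or rolled back on the server.
var cts = new CancellationTokenSource(TimeSpan.FromSeconds(5));
var result = await R.Db("mydb").Table("mytable")
    .Insert(games)
    .RunResultAsync(conn, null, cts.Token); // assumed overload: (conn, runOpts, cancelToken)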
Additionally, with regard to Cursor<T>.MoveNextAsync(), the CancellationToken has no effect if the cursor still has buffered items to withdraw. Cancellation in this regard pertains to the wait operation of an outstanding network request for more Cursor items.
Cancellation on a Cursor's MoveNextAsync() method is an enumeration-safe operation. When a TaskCanceledException is thrown because of a CancellationToken, the exception does not disrupt the ordering of cursor items. Also, since cancellation does not cancel an already in-progress network request for more Cursor items, future calls to MoveNextAsync that are more patient will succeed at processing the network response, in due time and in order.
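For example, a sketch of patient, cancellable iteration (assuming MoveNextAsync accepts the token directly, as described above; the database and table names are placeholders):
// A cursor over a hypothetical large table; items arrive in batches
// over the network, so MoveNextAsync may have to wait for the next batch.
var cursor = await R.Db("mydb").Table("mytable").RunCursorAsync<JObject>(conn);

while (true)
{
    // Wait at most 10 seconds for the next item.
    var cts = new CancellationTokenSource(TimeSpan.FromSeconds(10));
    try
    {
        if (!await cursor.MoveNextAsync(cts.Token)) break;
        Console.WriteLine(cursor.Current["id"]);
    }
    catch (TaskCanceledException)
    {
        // Only the wait was cancelled, not the query; ordering is preserved,
        // so a later, more patient MoveNextAsync picks up where we left off.
        Console.WriteLine("Still waiting on the next batch...");
    }
}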
This C# driver supports POCO serialization to RethinkDB via Newtonsoft.Json. The default serializer can be overridden by replacing the JsonSerializer in the Converter.Serializer static property.
RethinkDb.Driver.Net.Converter.Serializer = new JsonSerializer(/*custom*/);
Keep in mind, however, there are native types that RethinkDB expects in a specific JSON format. Native ReQL types like times, dates, binary data, and arrays (DateTime, DateTimeOffset, byte[], IEnumerable) need to be in a specific JSON format over the wire in order to perform ReQL operations on them. By default, native ReQL types are converted automatically if the default Converter.Serializer is used. More information about ReQL pseudo types can be found here.
Overriding the default Converter.Serializer requires including the pseudo type converters when replacing the default serializer in order to maintain correct type conversions between native types and ReQL pseudo types. Instances of the JSON converters can be found in static properties of the Converter class (a registration sketch follows the list below):
Converter.DateTimeConverter
Converter.BinaryConverter
Converter.GroupingConverter
Converter.PocoArrayConverter
Converter.PocoExprConverter
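For example, a minimal sketch of a replacement serializer that keeps the pseudo type converters registered (the custom settings shown are placeholders):
// Build a custom serializer, then re-register the driver's pseudo type converters.
var serializer = new JsonSerializer
{
    NullValueHandling = NullValueHandling.Ignore // example custom setting
};
serializer.Converters.Add(Converter.DateTimeConverter);
serializer.Converters.Add(Converter.BinaryConverter);
serializer.Converters.Add(Converter.GroupingConverter);
serializer.Converters.Add(Converter.PocoArrayConverter);
serializer.Converters.Add(Converter.PocoExprConverter);

RethinkDb.Driver.Net.Converter.Serializer = serializer;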
The example below shows how to insert and retrieve a User POCO object into and from a table:
var user = new User
{
Name = "Brian",
Phone = "555.555.5555",
Birthday = new DateTime(1990, 8, 18, 0, 0, 0, DateTimeKind.Utc),
};
var result = R.Db("mydb").Table("mytable")
.Insert(user).RunResult(conn);
var pocoId = result.GeneratedKeys[0];
var poco = R.Db("mydb").Table("mytable")
.Get(pocoId).RunAtom<User>(conn);
By default, the primary key for JSON documents stored in tables is the id field (case-sensitive). The default primary key id can be changed during table creation by specifying the primaryKey optional argument.
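For example, a minimal sketch of creating a table with a custom primary key (the db and table names are placeholders; primary_key is the over-the-wire name of the option):
// Documents in this table will use "code_name" as their primary key
// instead of the default "id".
R.Db("mydb").TableCreate("heroes")
    .OptArg("primary_key", "code_name")
    .Run(conn);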
There are two ways to generate primary keys for your documents: 1) client-side and 2) server-side.
For client-side key generation, it is particularly important to ensure that the PascalCase Id property of a POCO object maps to the lower-case JSON id field. For example:
public abstract class Document
{
[JsonProperty("id")]
public string Id { get; set; }
protected Document()
{
this.Id = Guid.NewGuid().ToString();
}
}
public class Person : Document
{
public string FirstName { get; set; }
public string LastName { get; set; }
}
Guid.NewGuid() is used to generate client-side Ids for documents, as shown above in the Document() constructor.
If no id field is specified in the JSON document upon insert, RethinkDB will generate a primary key for the document. The primary key will be appended and stored with the document on the server side during insert. Newtonsoft can be configured to omit the id property from your object when the document is being serialized if the Id property value is null or default. The NullValueHandling or DefaultValueHandling settings on JsonProperty can be used to control the omission of the id field when serializing the document upon insertion.
For example:
public abstract class Document
{
    // Option 1: Id as string (string reference type)
    [JsonProperty("id", NullValueHandling = NullValueHandling.Ignore)]
    public string Id { get; set; }

    // Option 2: Id as Guid (Guid value type)
    //[JsonProperty("id", DefaultValueHandling = DefaultValueHandling.Ignore)]
    //public Guid Id { get; set; }

    // Id will be populated by RethinkDB;
    // check "GeneratedKeys" in the response after
    // inserting objects.
}
public class Person : Document
{
public string FirstName { get; set; }
public string LastName { get; set; }
}
When the Person document is retrieved via .Get (or some other selection method), the Id field will not be empty.
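A minimal round-trip sketch under that setup (the db and table names are placeholders reused from the earlier example):
var person = new Person { FirstName = "Brian", LastName = "Chavez" };
// Id is null here, so it is omitted from the serialized JSON and
// RethinkDB generates the primary key on the server.
var insert = R.Db("mydb").Table("mytable")
    .Insert(person)
    .RunResult(conn);

var generatedId = insert.GeneratedKeys[0];
var fetched = R.Db("mydb").Table("mytable")
    .Get(generatedId)
    .RunAtom<Person>(conn);
// fetched.Id is now populated with the server-generated key.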
The RethinkDb.Driver.Extras.Dao namespace contains classes to help build Data/Document Access Objects (DAOs) for applications. Consider the following:
//Domain Object
public class Person : Document<Guid>
{
public string FirstName { get; set; }
public string LastName { get; set; }
}
//Person DAO
public class PersonDao : RethinkDao<Person, Guid>
{
public PersonDao(IConnection conn, string dbName, string tableName)
: base(conn, dbName, tableName)
{
}
public Person FindUserByFirstName(string firstName)
{
return Table
.GetAll(firstName).OptArg("index", "firstNameIdx")
.Nth(0)
.RunAtom<Person>(conn);
}
}
And its usage:
var dao = new PersonDao(conn, "query", "test");
var person = new Person
{
FirstName = "Brian",
LastName = "Chavez"
};
var savedDoc = dao.SaveOrUpdate(person);
savedDoc.Id.Should().NotBeEmpty();
savedDoc.FirstName = "Bryan";
var updatedDoc = dao.SaveOrUpdate(savedDoc);
Notice: All domain objects derive from Document<IdT>. Also, PersonDao : RethinkDao<T,IdT> will contain basic operations such as Save, Update, Delete, and GetById. The derived DAO can contain custom methods that execute more complex ReQL queries, such as FindUserByFirstName in the example above.
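A short sketch of the inherited base operations (the method names are those listed above; exact signatures may vary by driver version):
// Fetch the document back by its primary key...
var fetched = dao.GetById(savedDoc.Id);

// ...and remove it when it is no longer needed.
dao.Delete(fetched);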
Additionally, JObject support is first-class. Insert and retrieval of a JObject are shown below:
var user = new JObject
{
["Name"] = "Brian",
["Phone"] = "555.555.5555",
["Birthday"] = new DateTime(1990, 8, 18, 0, 0, 0, DateTimeKind.Utc),
};
var result = R.Db(DbName).Table(TableName)
.Insert(user).RunResult(conn);
var objId = result.GeneratedKeys[0];
var jObjRaw = R.Db(DbName).Table(TableName)
.Get(objId).RunAtom<JObject>(conn);
Similar to POCO support, the RethinkDb.Driver.Net.Converter.Serializer is used when serializing JObject types.
The Java and C# drivers allow format options when calling .Run(conn, runOpts). When the underlying raw $reql_type$ is desired, the following format options explicitly instruct the driver to leave $reql_type$s as-is in JToken derivative types (JObject or JArray):
- time_format: 'raw' - Leaves RethinkDB dates and times as $reql_type$:TIME.
- binary_format: 'raw' - Leaves RethinkDB binary as $reql_type$:BINARY.
- group_format: 'raw' - Leaves RethinkDB grouped data as $reql_type$:GROUPED_DATA.
To bypass the C# pseudo type conversion and leave $reql_type$ pseudo types as-is, simply declare your expected type as JObject (or JArray, depending on your result) and specify a format option. For example:
JObject result = R.Now().Run<JObject>(conn, new {time_format = "raw"});

/* result now contains the raw pseudo type:
{
  "$reql_type$":"TIME",
  "epoch_time":1462248375.766,
  "timezone":"+00:00"
}
*/
result["$reql_type$"].ToString().Should().Be("TIME");
The C# driver supports the use of anonymous types in place of POCOs. For example, inserting an anonymous type into a table:
var user = new {
Name = "Brian",
Phone = "555.555.5555",
Birthday = new DateTime(1990, 8, 18, 0, 0, 0, DateTimeKind.Utc),
};
var result = R.Db(DbName).Table(TableName)
.Insert(user).RunResult(conn);
Similar to POCO support, the RethinkDb.Driver.Net.Converter.Serializer is used when serializing anonymous types.
The following is only possible with C#:
public class TopPlayer
{
public int PlayerId { get; set; }
}
var games = new[]
{
new Game {id = 2, player = "Bob", points = 15, type = "ranked"},
new Game {id = 5, player = "Alice", points = 7, type = "free"},
new Game {id = 11, player = "Bob", points = 10, type = "free"},
new Game {id = 12, player = "Alice", points = 2, type = "free"},
};
List<TopPlayer> result =
R.Expr(games)
.Filter(g => g["points"].Gt(9))
// Anonymous type projection to shape the result
// to fit into TopPlayer
.Map(g => new { PlayerId = g["id"] })
.Run<List<TopPlayer>>(conn);
result.Dump();
result.ShouldBeEquivalentTo(new[]
{
new TopPlayer {PlayerId = 2},
new TopPlayer {PlayerId = 11}
});
Tip: Use nameof in C# 6 to maintain refactorable queries:
.Filter(g => g["points"].Gt(9))
.Filter(g => g[nameof(Game.points)].Gt(9))
Suppose we have the following async method that consumes a change feed on a chat table. Assume a single thread, named Thread A, enters HandleUpdates():
1 public static async Task HandleUpdates()
2 {
3 var conn = R.Connection().Connect();
4 var feed = await R.Db("test").Table("chat")
5 .Changes().RunChangesAsync<ChatMessage>(conn);
6
7 foreach (var message in feed){
8 Console.WriteLine($"{message.NewValue.User}: {message.NewValue.Text}");
9 }
10 }
Thread A on Line 4 asynchronously awaits the establishment of a Cursor object that can then be readily consumed by the foreach loop synchronously. Line 4 and Lines 7-9 do not asynchronously consume changes on a per-item basis.
Thread A begins iteration over the feed in the foreach loop as shown in Lines 7-9 above. Each change iteration over feed is a synchronous operation for Thread A. If there are no changes yet to be consumed, the iteration blocks on Line 7 in the example above.
If asynchronous consumption of change feed items is desired, the iteration must be driven manually, as there is no IEnumerableAsync pattern in C#/.NET. Asynchronous consumption of change feed items is achieved by using a while loop and await MoveNextAsync() as shown below:
1 public static async Task HandleUpdates()
2 {
3 var conn = R.Connection().Connect();
4 var feed = await R.Db("test").Table("chat")
5 .Changes().RunChangesAsync<ChatMessage>(conn);
6
7 while (await feed.MoveNextAsync()){
8 var message = feed.Current;
9 Console.WriteLine($"{message.NewValue.User}: {message.NewValue.Text}");
10 }
11 }
See the Reactive Extensions integration for a more elegant solution to consuming change feeds.
Reactive Extensions (Rx) work well with this driver. Below is an example of how to subscribe to a change feed using Rx:
var changes = R.Db("marvel").Table("heroes")
.Changes()
.RunChanges<Hero>(conn);
var observable = changes.ToObservable();
observable.Subscribe(OnNext, OnError, OnCompleted);
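Here, OnNext, OnError, and OnCompleted are handler methods you supply. An equivalent sketch using lambdas (assuming the changefeed items are Change<Hero> with a NewValue property):
observable.Subscribe(
    change => Console.WriteLine(change.NewValue),       // OnNext
    error => Console.Error.WriteLine(error),            // OnError
    () => Console.WriteLine("Change feed completed.")); // OnCompleted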
Projects using Rx need to reference System.Reactive using NuGet.
More detailed examples can be found here.
The following is only possible with C#:
//Objects inside Foobar table:
new Foobar {id = "a", Baz = 1, Qux = 1}
new Foobar {id = "b", Baz = 2, Qux = 2}
new Foobar {id = "c", Baz = 3, Qux = 3}
var exprA = R.Table("foobar").Get("a")["Baz"]; // 1
var exprB = R.Table("foobar").Get("b")["Qux"]; // 2
int result = (exprA + exprB + 1).Run<int>(conn);
// Everything between (...) executes on the server
// and returns result 4.
The last line, (exprA + exprB + 1), is converted into an AST and sent to the server for evaluation, including the + 1 part. The + 1 is not evaluated on the client. Here's what happens:
The compiler/run-time knows exprA is of type ReqlExpr, moves right to exprB (also of type ReqlExpr), and applies the + operator overload for adding two ReqlExprs whose sum is also a ReqlExpr (under the hood, all we're doing is exprA.Add(exprB)). Lastly, the evaluation moves right again to the last + 1 but encounters an int type. The implicit conversion operator kicks in (int -> ReqlExpr) and converts the int into a Datum(1) (which inherits from ReqlExpr). Finally, the last + Datum(1) can be evaluated. The final equivalent ReQL sequence is: exprA.Add(exprB).Add(new Datum(1)). Beautiful. ❤️
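In other words, the operator form above could also be written explicitly; a quick sketch of the equivalent call chain:
// Equivalent to (exprA + exprB + 1) without operator overloads;
// the whole expression is still evaluated on the server.
int explicitResult = exprA.Add(exprB).Add(1).Run<int>(conn); // 4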
The benefit of implicit conversion and operator overloads is better language integration. For example, both Filters below are equivalent:
.Filter(g => g[nameof(Game.points)].Gt(9)) // explicit ReQL Gt
.Filter(g => g[nameof(Game.points)] > 9)   // operator overload
The following is only possible with C#.
Sometimes it's useful to serialize a ReQL expression across an application boundary. To do so, use the ReqlRaw pseudo AST type and its associated ReqlRaw.ToRawString method as shown below:
//Objects inside Foobar table:
new Foobar {id = "a", Bar = 1, Qux = 1}
new Foobar {id = "b", Bar = 2, Qux = 2}
new Foobar {id = "c", Bar = 3, Qux = 3}
// Create a detached filter function
ReqlFunction1 filter = expr => expr["Bar"].Gt(2);
// Convert it to a string representation
string filterSerialized = ReqlRaw.ToRawString(filter);
Transmit the filterSerialized string over a network, into outer space, store it in a database, inside a config file, or across any application boundary. Homie don't care 😺. Bring the ReQL expression back to life for use in an executable query by calling ReqlRaw.FromRawString(string) as shown below:
var filterExpr = ReqlRaw.FromRawString(filterSerialized);
var result = R.Db("MyDb").Table("Foobar").Filter(filterExpr).Run(conn);
result.Dump();
/* OUTPUT:
[
{
"id": "c",
"Bar": 3,
"Baz": 3,
"Idx": "qux",
"Tim": null
}
]
*/
Take it a step further and serialize the whole query! Holy cow 🐮, Batman!
private ReqlExpr IsForbidden(ReqlExpr x)
{
return R.Expr(R.Array(1, 2, 3)).Contains(number => number.Eq(x));
}
//Pick numbers that are not 1, 2, or 3 in the sequence 5, 4, 3.
var query = R.Expr(R.Array(5, 4, 3)).Filter(n => IsForbidden(n).Not());
var queryString = query.ToRawString();
Transmit the queryString over a network, into outer space, store it in a database, inside a config file, or across any application boundary. Hydrate and bring the query back to life:
var queryAst = R.FromRawString(queryString) // So far, numbers 5 and 4
.Filter(x => x.Ge(5)); //And Filter Again!
var result = queryAst.Run(conn);
result.Dump();
/* OUTPUT:
[5]
*/
Let's see Entity Framework do that! Ahhh, Ha! 😺
Happy ReQL-ing! 🚀