time to read 1 min | 143 words

I have been head down in NHibernate for the last several months, and as a result, I think that I started to miss just how enabling Active Record really is. This weekend I have been working with it with a vengeance, and I love it. The NHibernate 1.2 integration is really important, because it allows it to infer even more stuff!

Allow me to present, in all its glory, the amount of stuff needed to make a class persistent:

[Screenshot]

The collapsed properties merely hide the get/set clutter, by the way.
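In other words, something along these lines (an illustrative sketch, not the exact class from the screenshot):

[ActiveRecord]
public class Post : ActiveRecordBase<Post>
{
    private int id;
    private string title;
    private Blog blog;

    // With the NHibernate 1.2 integration, column names and types
    // are inferred from the properties themselves
    [PrimaryKey]
    public int Id
    {
        get { return id; }
        set { id = value; }
    }

    [Property]
    public string Title
    {
        get { return title; }
        set { title = value; }
    }

    [BelongsTo]
    public Blog Blog
    {
        get { return blog; }
        set { blog = value; }
    }
}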

This is enough information for Active Record (along with the other classes defined in the same way) to allow me to work against the model immediately after creating it!

SQL Refactor

time to read 12 min | 2301 words

So I got the beta of SQL Refactor, opened SQL Management Studio and liked what I saw:

[Screenshot]

The first thing that I noticed, more important than anything else, is that it has a SQL beautifier. This is the first time I have seen one for T-SQL (there seem to be a lot of them for Oracle, though), and the first where the beautifier is integrated directly into SQL Management Studio.

[Screenshot]

Why is this important? Well, check this out (from Northwind):

ALTER procedure [dbo].[Employee Sales by Country]
@Beginning_Date DateTime, @Ending_Date DateTime AS
SELECT Employees.Country, Employees.LastName, Employees.FirstName, Orders.ShippedDate, Orders.OrderID, "Order Subtotals".Subtotal AS SaleAmount
FROM Employees INNER JOIN
      (Orders INNER JOIN "Order Subtotals" ON Orders.OrderID = "Order Subtotals".OrderID)
      ON Employees.EmployeeID = Orders.EmployeeID
WHERE Orders.ShippedDate Between @Beginning_Date And @Ending_Date

And after applying the formatting:

ALTER procedure [dbo].[Employee Sales by Country]
    @Beginning_Date DateTime,
    @Ending_Date DateTime
AS
    SELECT
        Employees.Country,
        Employees.LastName,
        Employees.FirstName,
        Orders.ShippedDate,
        Orders.OrderID,
        "Order Subtotals".Subtotal AS SaleAmount
    FROM
        Employees
    INNER JOIN
      (
        Orders INNER JOIN
        "Order Subtotals"
        ON Orders.OrderID = "Order Subtotals".OrderID
      )
        ON Employees.EmployeeID = Orders.EmployeeID
    WHERE
        Orders.ShippedDate Between @Beginning_Date And @Ending_Date

I can actually read that. And the formatting is configurable enough that in 30 seconds I got it just the way I wanted. Okay, so I am gushing over a code formatter, which is not very exciting, until you realize that I have to read thousands of SQL statements that were just off. The other stuff on the menu looks very interesting as well.

Expand wildcards means moving from "select * from foo" to a proper statement. Qualify names seems to work on the schema level, although I expected it to work on the columns as well. (Meaning that it turned Employees into [dbo].[Employees], but didn't change FirstName to Employees.FirstName.)
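For instance (my own example, not the tool's actual output), expanding the wildcard on Northwind's Shippers table turns this:

SELECT * FROM Shippers

into this:

SELECT Shippers.ShipperID, Shippers.CompanyName, Shippers.Phone
FROM Shippers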

Moving right along, I looked at the script summarizer:

[Screenshot]

It can give an overview of a script, and take you directly into parts of it. Then you can use the Encapsulate Stored Procedure action to extract stuff out. I like the way the wizards for the actions are set up, so I can move back and forth between them, and the commentary is excellent (the screenshot below is from the split table refactoring):

[Screenshot]

Finishing thoughts:

  • It generates scripts that would modify the DB, and the whole experience is... respectful is not quite the word I am looking for here. Too often I see a tool that is dumbing down ideas, and I get annoyed. Check out this UI; I don't think that I can explain it better right now (now being 3 AM here):
    [Screenshot]
  • I wonder how this would deal with a big DB (4,000+ tables, hundreds of SPs, views, etc.) when working on it remotely.
  • Missing finishing touches:
    • There is no right-click integration in the text editor, which is annoying.
    • No accelerator key for the menu, which is more annoying.

It looks very nice, and as you can see, what excites me the most is probably what the guys at Red Gate spent the least time on :-). When I hear the term Database Refactoring, I think about moving tables around: Add Column, Move Column, etc. I didn't consider that refactoring in this sense would be improvements to the scripts themselves. In retrospect, that seems very obvious.

I know that SQL Prompt has already changed the way I work with databases, and I think that this has similar potential. I know of one thing I would like to put it to use on already: a rename refactor on all the tables in the database, from Taarich to Date. I work with databases quite a bit, although I am not a DBA, and from a ten minute run, I know of several places where I would want to use it. (I would have given a lot to have this nine months ago, when I was doing the Big SQL Project.)

time to read 5 min | 871 words

In my previous post about NHibernate and stored procedures, I showed how it can be done, and I closed off with this:

The disadvantages - you take away from NHibernate the ability to comprehend the structure of the tables; this means that it can't do joins, eager fetching, etc. This is an extremely important capability that just vanished. Likewise for queries: NHibernate's ability to query your objects is severely limited using this method of operation.

Galen commented:

It sounds like the following two architectural decisions are mutually exclusive:
1. Require stored procedures for all data access
2. Use NHibernate

Is there an effective way to do both?

We have a particularly large project (3 year, multi-million) on which we know we need code generation.  Our choices are down to
a) NHibernate/MyGeneration or
b) Roll our own code generation using CodeSmith templates. 

We want to use NHibernate, but our data architects are unhappy about losing the "require stored procedures" battle.

I won't get into the issues I have with "SP for everything"; I already expressed them (at some length) elsewhere. NHibernate is a great tool for abstracting the database, and calling stored procedures is a good step forward, mainly because some issues are best solved with them.

The main problem with stored procedures is that they rob NHibernate of one of its most important advantages: its flexibility. If you want to do an arbitrary join between three entities and sort by a fourth, you can do it very easily. This opens up a lot of options that you just can't have with SPs. Using SPs, if the SP doesn't allow it, you just can't do it. And any time you have a new need, you have to either create a new SP or add parameters, and I am not sure which is worse.

Here is what I would recommend, assuming that the "Give Me Stored Procedures Or Give Away The Database" mindset is pervasive: do all selects from views (preferably thin views), and all CUD via stored procedures. I have had quite a bit of success using views with NHibernate, and I don't really care how NHibernate writes stuff to the database. That is always going to be a fairly simple insert/update/delete statement, and as long as the SP keeps the usual semantics, everything is going to be fine.
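A mapping along those lines might look like this (an illustrative sketch using NHibernate 1.2's custom SQL support; the stored procedure names, and the number and order of the parameters, have to match your actual mapping):

<class name='Employee' table='EmployeesView'>
    <id name='Id' column='employee_id'>
        <generator class='identity'/>
    </id>
    <property name='Name'/>

    <!-- Selects go against the view; all CUD goes through the SPs -->
    <sql-insert>exec dbo.InsertEmployee ?</sql-insert>
    <sql-update>exec dbo.UpdateEmployee ?, ?</sql-update>
    <sql-delete>exec dbo.DeleteEmployee ?</sql-delete>
</class>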

In most cases, this makes the DBAs happy, since they get to deny everyone direct access to the tables, and they control both the views and the SPs. At the same time, NHibernate remains mostly free to join between the views, so it doesn't lose any of its power.

Take into account that writing the SPs and wiring NHibernate to use them is still not simple, but you get all the power of NHibernate, and the DBAs are happy because they keep all of the control in their hands and can optimize the database to their heart's content.

Now, if you can't get your data architects to accept that, it is a problem.

Would I use NHibernate in a situation where SPs are really the only way to use the database? Yes. But you should read this statement with the qualifier that says "I really like NHibernate and I know what I'm doing there." There is a lot of value that NHibernate brings to the table quite apart from the flexibility of working with the data model (just the Unit Of Work is a huge issue for me, and I could gush on and on and on about it... hm, actually, I do). But I don't have any experience working with NHibernate in an SP-only manner. (And if I have any say in the matter, I won't :-) ). It might be better to ask the Java guys that have done it before.

I would highly recommend against rolling your own, for reasons already mentioned.

time to read 4 min | 700 words

I have a piece of code that has to calculate some pretty hefty stuff over a large amount of data. Unfortunately, that large amount of data took a large amount of time to load. By a large amount I mean: I walked away and had time for a coffee, some chit chat, about three phone calls and a relaxing bout of head banging, and it still continued to pry into the database, and likely would continue to do so until the end of time or thereabouts.

This calculation has two main characteristics:

  1. It is vital to several core functions of the system.
  2. It is very highly preferred to make this calculation on the fly. Doing it on the backend is possible, but would cause a lot of complications.

So, in the middle of checking the price of a dozen new servers (and a safe place in Nigeria, for when the client hears about this), it occurred to me that while premature optimization is evil, maybe optimization itself has some value, and that Nigeria might have to wait for another day.

After carefully thinking about the scientific process (i.e., observing the occurrences, forming a theory, preparing experiments, proving the theory, arguing for a decade about what it means, etc.), I decided to take a more direct approach and looked at what I was trying to do.

Then I added this to the query:

left join fetch e.Rules rules
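In context, the full query was something along these lines (the entity names have been changed to protect the guilty):

from Employee e
    left join fetch e.Rules rules
where e.Id = :id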

And I re-ran the whole thing. The performance benefit was four orders of magnitude. By that I mean that while the page is still very heavy DB-wise (around ~50 queries on an empty cache, which is my usual benchmark), it actually completes in real time, and all the rest of that stuff is very easy to solve (grab those three pieces of code in one shot, instead of five, etc.).

Of course, I then had to spend about half an hour staring at the generated query and thinking about what it was doing (there was a CROSS JOIN there that scared me) before coming to the conclusion that it really was a good idea to fetch all that data. Well, almost. There should have been around ~6,000 rows returned from this query, but only 2 were returned.

After a long bout of head scratching, I determined that the fault was in my mapping. I had several places with where clauses like this one:

where="Cancelled = 0"

I'm pretty sure that you can see where this is going. In a left join scenario, this (non-nullable) column is going to be null, so the condition evaluates to false, turning the left join into an inner join and reducing the returned data by quite a bit. I'm writing this post as I go along fixing this issue. Right now the situation is not much improved :-(
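To spell the problem out, the difference is roughly this (illustrative SQL; the mapping-level where clause lands in the WHERE, not in the join condition):

-- What I got: rows with no matching rule have a null Cancelled,
-- fail the filter, and the left join behaves like an inner join
SELECT e.*, rules.*
FROM Employees e
    LEFT JOIN Rules rules ON rules.EmployeeId = e.Id
WHERE rules.Cancelled = 0

-- What I actually wanted: the filter as part of the join itself
SELECT e.*, rules.*
FROM Employees e
    LEFT JOIN Rules rules ON rules.EmployeeId = e.Id
        AND rules.Cancelled = 0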

After a long and hard battle, I managed to bring it down by merely three orders of magnitude, and I ran into some issues with the code that uses it, so I need to fix those first.

I'll try to post some ideas about how to solve the complex SELECT N+(M*N*Zm)+1 issues (where N is the number of items, M is the number of collections in each item, and Zm is the number of items in each collection in each item). To get a sense of the scale: with 100 items, 3 collections per item, and 10 items per collection, that is 100+(3*100*10)+1 = 3,101 queries for a single operation. (The short version: don't do this.)

time to read 1 min | 141 words

No, I'm not going to talk about the hard-to-create/understand queries, performance, or maintainability. The really bad side of composite keys is that you get associations using part of a key. Here is an example:

[Diagram]

Now, start working with the associations here. Payments are linked to salaries that fall within their date range and belong to the same employee. Needless to say, this breaks down FK support, so you will end up with bad data that you need to clean. Not to mention that the queries you need against this database are horrendous.
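The association above translates to joins along these lines (the schema is illustrative):

SELECT p.*, s.*
FROM Payments p
    JOIN Salaries s ON s.EmployeeId = p.EmployeeId
        AND p.PaymentDate BETWEEN s.StartDate AND s.EndDate
-- no foreign key can enforce this relation, so bad data is
-- only a matter of time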

But the worst thing that I saw was a relation between two tables that relied on the last N left-most unique characters in a string key.

time to read 22 min | 4297 words

Often, when introducing NHibernate, I need to integrate with an existing database and infrastructure. Leaving aside the question of stored procedures (since I already expanded on that at length here), which are available in NHibernate 1.2, I want to focus on using SQL functions here.

(Once again, I'm back to the Blog -> Posts -> Comments model.)

Now, there are four types of SQL Functions that you might want to use:

  • Scalar functions that are a part of an entity.
    A post's total spam score may be calculated using a SQL function, and it is part of the properties of the object.
  • Scalar functions that are used for calculations, and should be called explicitly.
    A blog's popularity score may be calculated using a SQL function, but it is too expensive to calculate and not often needed.
    Note: only this case requires NHibernate 1.2; all the other features can be done using NHibernate 1.0.2.
  • Table-valued functions (or stored procedures, for that matter, but that is a bit harder) that return entities.
    A selection of posts with a specific spam score is one example.
  • Scalar functions that you want to use as part of your HQL queries.
    For instance, you may want to use database (or user defined) functions as part of your HQL queries. Think lower(), dbo.postSpamScore(), etc.

Let us attack each of those in turn, shall we?

First, we have a scalar function that is a property of the entity, in this instance a post's spam score. The SQL function is defined like so:

CREATE FUNCTION GetPostSpamScore ( @postId INT )
RETURNS INT AS BEGIN
      RETURN 42
END

Not very exciting, I know, but for our purposes, it is enough. Now, I need to define the following in the mapping file:

<property name='SpamScore' formula='dbo.GetPostSpamScore( post_id )'/>

The formula attribute is very powerful; you can even put in SQL statements that will be executed as correlated sub-queries (if your database supports them).

Q: Hi, what about aliasing? If I use this and join against another table that has a post_id (for instance, the comments table), won't I get an error or unpredictable results?
A: No, NHibernate will automatically prepend the alias of the current entity's table to anything that looks like an unqualified column access. If you need a column from another table, make sure to use the qualified column name.
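So, for the mapping above, the generated SQL will look something along these lines (alias and column names are illustrative):

SELECT this_.post_id, this_.post_blogid,
    dbo.GetPostSpamScore( this_.post_id ) as SpamScore
FROM Posts this_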

That is all you need to do to get the value from the function into your entities. In fact, you can now even perform HQL queries against this property, like this:

from Post p where p.SpamScore > 50

For the second case, we want to get a scalar result from the database, and we are not interested in using ADO.Net directly to do so. Note that this is only possible with NHibernate 1.2; in NHibernate 1.0.2, you will need to make the ADO.Net calls directly. Again, the SQL function is very simple (for demonstration only; you can do anything you want in the SQL function, of course):

CREATE FUNCTION GetBlogTotalSpamScore ( @blogId INT )
RETURNS INT AS BEGIN
      RETURN 42
END

Now, let us map this query so it would be easy to use...

<sql-query name='BlogSpamScore'>
       <return-scalar column='SpamScore' type='System.Int32'/>
       <![CDATA[
              select dbo.GetBlogTotalSpamScore(:blog) as SpamScore
       ]]>
</sql-query>

And the code that uses this:

int spamScore = (int)session.GetNamedQuery("BlogSpamScore")
       .SetEntity("blog", blog).UniqueResult();

A couple of points here. Notice that we are using the NHibernate notation for parameters in the query (:blog), and that we are passing an object to the query, not the identifier. I find this style of coding far more natural and OO than the equivalent ADO.Net code. For that matter, the equivalent ADO.Net code goes on for half a page or so... :-)

The third case is using a table-valued function. A table-valued function obviously cannot be used as a property of the entity. (To be exact, it can be used as a collection, but this is advanced functionality, and it is only available in NHibernate 1.2, so I'm not going to cover it here.) Fairly often, you want to use this for getting a collection of entities from the database using logic that resides in the database (for performance or historic reasons).

Here is my demo function:

ALTER FUNCTION GetAllPostsWithSpamScoreOfAtLeast( @minSpamScore int)
RETURNS TABLE AS RETURN
(
      SELECT post_id, post_blogid FROM Posts
      -- implement WHERE clause here
);

Now, let us map this for a set of entities:

<sql-query name='AllPostsWithSpamScoreOfAtLeast'>
       <return class='NHibernate.Generics.Tests.Post, NHibernate.Generics.Tests'
                     alias='post'/>
       <![CDATA[
              SELECT post_id as {post.PostId}, post_blogid as {post.Blog}, 100 as {post.SpamScore}
              FROM dbo.GetAllPostsWithSpamScoreOfAtLeast(:minSpamScore)
       ]]>
</sql-query>

Notice that I map all the columns to their respective properties (I could also use {post.*} if I wanted all of the columns without bothering to specify each and every one), and that I can pass parameters easily to the query.

Now, I can just query them like this:

IList list = session.GetNamedQuery("AllPostsWithSpamScoreOfAtLeast")
       .SetInt32("minSpamScore", 5).List();

Again, this is much more readable than the equivalent ADO.Net code, and I get full-fledged objects back. If you want to do projections (showing a post summary, etc., using just part of the entity), you need to map the resulting projection as if it were an entity; NHibernate doesn't support SQL projections. (In practice, this is not an issue.)

The last case is using user defined SQL functions (or built-in ones that NHibernate doesn't recognize by default) in your HQL queries. In order to do this, you need to either extend the dialect for your database, or dynamically add new functions to the dialect in the session factory (not really recommended).

Let us assume that you really want the ISOweek function. (See here for the implementation, if you really care.) The function declaration is:

CREATE FUNCTION dbo.ISOweek (@DATE datetime) RETURNS INT

I would recommend extending the dialect for the database with the new function, like this:

public class MyDialect : MsSql2000Dialect
{
       public MyDialect()
       {
              RegisterFunction("dbo.isoweek", new StandardSQLFunction(NHibernateUtil.Date));
       }
}

A couple of things to note here. If this is a user defined function, you have to add the schema (in this case, dbo). The function name must be in lower case (HQL is case insensitive, so you can use whatever case you like in the queries, but you register the function in lower case). You then configure NHibernate to use MyDialect instead of your database's dialect, and that is it. You can now issue HQL queries like this:

from Post p where dbo.ISOWeek(p.date) = 51

Note that you must still use the "dbo." prefix in the HQL. Again, you can pass arguments naturally:

session.CreateQuery("from Post p where dbo.ISOweek( :date ) = 12")
       .SetDateTime("date", DateTime.Now)
       .List();

(Not that I can think of a reason why you would want to execute the last two queries.)

So, this is just about all you would ever want to know about SQL Functions, and if you kept reading up to now, you are really passionate about it.

Happy (N)Hibernating... :-)

time to read 2 min | 293 words

One of the nicest parts of developing with NHibernate is that you can get NHibernate to generate the tables from the mapping. This is extremely useful during development, when changes to the domain model and the data model are fairly common. It is even more important when you want to try something on a different database. (For instance, you may want to use a SQLite database during testing/development, and SQL Server for production, etc.) I intend to post soon about unit testing applications that use NHibernate, and this functionality is extremely important in those cases.

This functionality has just one issue: unless you explicitly specify the table name, it uses the name of the class as the name of the table, and this tends to give me ticks. Mostly because I am used to thinking about tables in plurals. A table named Employee is an anathema; a table named Employees is all right.

Take a look at this simple model:

[Diagram]

Creating the mapping for it is not hard, just tedious at times. Creating the mapping and then creating the tables is just boring, and the default NHibernate naming rules are not acceptable to me. I decided to take a leaf from Ruby on Rails (actually, I robbed the MonoRail generator for the source code, and converted it from Boo to Ruby) and create an Inflector and a pluralizing naming strategy. Now, when I generate the table structure from the mapping, I get the following database structure:

[Diagram]

Now this is much nicer.
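The Inflector itself is just a list of rules; here is a minimal sketch of the idea (the real one has a longer rule list and handles irregular nouns):

using System.Text.RegularExpressions;

public static class Inflector
{
    public static string Pluralize(string word)
    {
        // Boss -> Bosses, Match -> Matches
        if (Regex.IsMatch(word, "(s|x|z|ch|sh)$", RegexOptions.IgnoreCase))
            return word + "es";
        // Category -> Categories
        if (Regex.IsMatch(word, "[^aeiou]y$", RegexOptions.IgnoreCase))
            return word.Substring(0, word.Length - 1) + "ies";
        // Employee -> Employees
        return word + "s";
    }
}

Plug that into a naming strategy, and the Employee class gets the Employees table without a single explicit table name in the mappings.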

 

time to read 4 min | 660 words

After last night's post about the performance benefits of SqlCommandSet, I decided to give the ADO.Net team some headaches, and release the results in a reusable form.

The relevant code can be found here, as part of Rhino Commons. Besides exposing the batching functionality, it is a very elegant (if I say so myself) way of exposing functionality that the original author decided to mark private / internal.

I really liked the declaration of this as well:

[ThereBeDragons("Not supported by Microsoft, but has major performance boost")]
public class SqlCommandSet : IDisposable
The usage is very simple:

SqlCommandSet commandSet = new SqlCommandSet();
commandSet.Connection = connection;
for (int i = 0; i < iterations; i++)
{
       SqlCommand cmd = CreateCommand(connection);
       commandSet.Append(cmd);
}
int totalRowCount = commandSet.ExecuteNonQuery();

As a note, I spiked a little test of adding this capability to NHibernate, and it seems to be mostly working; I got 4 (out of 694) tests failing because of it. I didn't check performance yet.

time to read 3 min | 490 words

I have ranted before about the annoying trend at Microsoft of welding the hood shut in most of the interesting places. One particularly painful piece is the command batching implementation in .Net 2.0 for SQL Server. This is extremely annoying mainly because the implementation's benefits go to those who are going to be using DataSets (ahem, not me), but are not available to anyone outside of Microsoft. (See topic: OR/M, NHibernate, etc.)

Today, I decided to actually check what the performance difference is all about. In order to do this, I opened the (wonderful, amazing) Reflector and started digging. To my surprise, I found that the batching implementation seems to be centralized around a single class, System.Data.SqlClient.SqlCommandSet (which is internal, of course, to prevent it from being, you know, useful).

Since the class, and all its methods, are internal to System.Data, I had to use Reflection to pry them out into the open. I noticed that the cost of reflection was fairly high, so I converted the test to use delegates, which significantly improved performance (see the sketch at the end of this post). The query I ran was a very simple one:

INSERT INTO [Test].[dbo].[Blogs] ([blog_name]) VALUES (@name)

With @name = 'foo' as the parameter value. The table is simply Id (identity) and Blog_Name (nvarchar(50)).

Note: Before each test, I truncated the table, to make sure it is not the additional data that is causing any slowdown.

The Results:

[Chart]

The X axis is the number of inserts made, the Y axis is the number of ticks that the operation took. As you can see, there is quite a performance difference, even for small batch sizes. There is a significant difference between batching and not batching, and the reflection / delegate calls are not a big cost in this scenario.

Here is the cost of a smaller batch:

[Chart]

This shows a significant improvement even for more real-world loads, even when we use Reflection.

I just may take advantage of this to implement a BatchingBatcher for NHibernate; it looks like it can be a nice performance benefit. Although this will probably not affect SELECT performance, which is usually the bigger issue.
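For the curious, the delegate trick boils down to something like this (a sketch: the type and member names come from what Reflector shows, and this is obviously unsupported and may break in any future version):

using System;
using System.Data.SqlClient;
using System.Reflection;

public class SqlCommandSetWrapper
{
    private delegate void AppendCommand(SqlCommand command);
    private delegate int ExecuteNonQueryCommand();

    // The internal class lives in System.Data, next to SqlCommand
    private static readonly Type commandSetType = typeof(SqlCommand)
        .Assembly.GetType("System.Data.SqlClient.SqlCommandSet");

    private readonly object instance;
    private readonly AppendCommand append;
    private readonly ExecuteNonQueryCommand executeNonQuery;

    public SqlCommandSetWrapper()
    {
        instance = Activator.CreateInstance(commandSetType, true);
        // Binding a delegate once is far cheaper than MethodInfo.Invoke per call
        append = (AppendCommand)Delegate.CreateDelegate(
            typeof(AppendCommand), instance,
            commandSetType.GetMethod("Append",
                BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic));
        executeNonQuery = (ExecuteNonQueryCommand)Delegate.CreateDelegate(
            typeof(ExecuteNonQueryCommand), instance,
            commandSetType.GetMethod("ExecuteNonQuery",
                BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic));
    }

    public SqlConnection Connection
    {
        set
        {
            commandSetType.GetProperty("Connection",
                BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic)
                .SetValue(instance, value, null);
        }
    }

    public void Append(SqlCommand command) { append(command); }
    public int ExecuteNonQuery() { return executeNonQuery(); }
}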

You can get the code here: BatchingPerfTest.txt

time to read 10 min | 1984 words


One of the biggest problems with abstractions is that they may allow you to do stupid things without those things being obvious. In OR/M-land, that usually means SELECT N+1 issues. The problem is that you often develop a certain piece of functionality first, and only then realize that while you tested, all was fine and dandy with the five items that you had; on the real system, you have 5,000, and the DBA is on his way to the ER...

Anyway, I am currently working with web applications, and I wanted to get a good indication of which pages are troublesome. Being who I am, I immediately began to design a framework that would correlate page requests with trace data from SQL Server, and then another system that would analyze it and spit out a report saying: "Wow, that ScaryReport.aspx page is making 30% of the calls in the application, take a look at that".

Not wishing to spend the next two years on this project, I decided to do something a bit more modest and utilize the already existing infrastructure. In this case, the infrastructure is NHibernate and log4net.

The secret tidbit is that NHibernate logs all queries to a log4net logger named "NHibernate.SQL". From there, it was a simple matter of adding logging helpers to Rhino Commons that would output the current page and the current request id (the request hash code, basically).
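The helpers are trivial; here is a sketch of the idea (the real ones are in Rhino Commons), relying on the fact that log4net renders a context property by calling ToString() on it at log time:

using System.Web;

public static class WebLoggingHelper
{
    public static readonly object CurrentPage = new CurrentPageValue();
    public static readonly object CurrentRequestId = new CurrentRequestIdValue();

    private class CurrentPageValue
    {
        // Evaluated lazily, per logged event
        public override string ToString()
        {
            HttpContext context = HttpContext.Current;
            return context == null ? "" : context.Request.RawUrl;
        }
    }

    private class CurrentRequestIdValue
    {
        // The request id is basically the request's hash code
        public override string ToString()
        {
            HttpContext context = HttpContext.Current;
            return context == null ? "" : context.GetHashCode().ToString();
        }
    }
}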
Then it was a matter of defining the following table:

CREATE TABLE [dbo].[NHibernatePerPageQueries](
   [Id] [int] IDENTITY(1,1) PRIMARY KEY NOT NULL,
   [RequestId] [int] NOT NULL,
   [Date] [datetime] NOT NULL,
   [Message] [nvarchar](max) NOT NULL,
   [PageURL] [nvarchar](max) NOT NULL
)


Then, to define the appender:

<appender name="NHibernatePerPageAppender"
          type="log4net.Appender.AdoNetAppender">
    <bufferSize value="10" />
    <connectionType value="System.Data.SqlClient.SqlConnection, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
    <connectionString value="Data Source=localhost;Initial Catalog=Logs;User ID=logger;Password=logger;" />
    <commandText value="INSERT INTO dbo.[NHibernatePerPageQueries] ([Date],[Message],[PageUrl],[RequestId]) VALUES (@log_date, @message, @pageUrl, @currentRequestId)" />
    <parameter>
        <parameterName value="@log_date" />
        <dbType value="DateTime" />
        <layout type="log4net.Layout.RawTimeStampLayout" />
    </parameter>
    <parameter>
        <parameterName value="@message" />
        <dbType value="String" />
        <size value="4000" />
        <layout type="log4net.Layout.PatternLayout">
            <conversionPattern value="%message" />
        </layout>
    </parameter>
    <parameter>
        <parameterName value="@pageUrl" />
        <dbType value="String" />
        <size value="2000" />
        <layout type="log4net.Layout.PatternLayout">
            <conversionPattern value="%property{nhibernate_page_url}" />
        </layout>
    </parameter>
    <parameter>
        <parameterName value="@currentRequestId" />
        <dbType value="String" />
        <size value="2000" />
        <layout type="log4net.Layout.PatternLayout">
            <conversionPattern value="%property{current_request_id}" />
        </layout>
    </parameter>
</appender>


And defining the logger:

<logger name="NHibernate.SQL">
    
<
level value="DEBUG" />
    
<
appender-ref ref="NHibernatePerPageAppender" />
</
logger>


We are still not done, though; we need the following in Application_Start():

GlobalContext.Properties["nhibernate_page_url"] = WebLoggingHelper.CurrentPage;
GlobalContext.Properties["current_request_id"] = WebLoggingHelper.CurrentRequestId;


This is it; now I can correlate the number of queries to page hits, and act accordingly. Normally, I think that the following queries should be enough:

-- Get pages ordered by number of total queries made from them
SELECT COUNT(*) [Number Of Queries In Page], PageUrl
FROM NHibernatePerPageQueries
WHERE substring(Message,1,1) != '@' -- remove parameters logs
GROUP BY PageUrl ORDER BY COUNT(*)

-- Get pages ordered by number of queries per page
SELECT AVG(countOfQueries) [Number Of Queries In Page Per Request], PageUrl FROM (
    SELECT COUNT(*) countOfQueries, PageUrl
    FROM NHibernatePerPageQueries
    WHERE substring(Message,1,1) != '@' -- remove parameters logs
    GROUP BY PageUrl, RequestId ) innerQuery
GROUP BY PageUrl ORDER BY AVG(countOfQueries)

Enjoy,
