Unverified Commit 9b8bdf36 authored by Tommy Hanks, committed by GitHub

Bring NRediSearch "even" with JRediSearch (#1267)

* The casing in the error message probably changed.

I checked the RediSearch source and found that it's been "Unsupported
language" since at least August 28, 2018.

* `FT.OPTIMIZE` has been deprecated.

Index optimizations are now handled by the internal garbage collector in
the background.

https://oss.redislabs.com/redisearch/Commands.html#ftoptimize

* Started work on porting the aggregation builder class and test from
JRediSearch.

* Ported in static support methods.

* Added Load, Limit, and SortBy

* Finished adding the `SortBy` overloads.

* Group by and apply have been ported.

* Added in the `Filter` method.
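
As a rough sketch of how the ported builder is meant to be used (the `client` variable and index are assumed; the `Aggregate` overload it feeds is added later in this change):

```csharp
// Compose an FT.AGGREGATE pipeline; names are taken from the ported builder below.
var query = new AggregationBuilder()                        // query text defaults to "*"
    .GroupBy("@name", Reducers.Sum("@count").As("sum"))     // group by name, summing count as "sum"
    .Filter("@sum>=10")                                     // keep groups whose sum is at least 10
    .SortBy(10, SortedField.Descending("@sum"))             // SORTBY ... MAX 10
    .Limit(0, 5);                                           // first five rows

AggregationResult result = client.Aggregate(query);         // issues FT.AGGREGATE against the index
```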

* Added in `Cursor` builder method.

Made the private `args` member naming more consistent with the rest of the project by prefixing it with an underscore.

* Made this a privately settable property as this is basically what the
Java version had.

* Implemented the `ArgsString` property here; it corresponds to the Java `getArgsString` method.

* Args list is now accessible just like the Java version.

* Added the `SerializeRedisArgs` method.

* Ported `TestAggregations` test from JRediSearch.

Kept everything as close to the original as possible.

* Marked these methods obsolete in favor of the ones that use the
AggregationBuilder.

* Introduced new overloads for the `Aggregate` method.

Marked the unit test that exercises the obsolete API as obsolete.

* For completeness I added in the commands that should be issued by the
setup portion of this test.

* Ported in the `TestApplyAndFilterAggregations` test.

* Porting over the support for dealing with aggregate cursors.

* Initial pass at implementing cursor delete.

* Initial pass at implementing `CursorRead` and `CursorReadAsync`
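
A rough sketch of the cursor round trip (mirrors the covering test; `client` is assumed):

```csharp
// Ask for a cursor when aggregating, then page through it and clean up.
var query = new AggregationBuilder()
    .GroupBy("@name", Reducers.Sum("@count").As("sum"))
    .Cursor(count: 1, maxIdle: 3000);                              // WITHCURSOR COUNT 1 MAXIDLE 3000

AggregationResult firstPage = client.Aggregate(query);             // first page plus a cursor id
AggregationResult nextPage = client.CursorRead(firstPage.CursorId, 1); // FT.CURSOR READ ... COUNT 1
bool deleted = client.CursorDelete(firstPage.CursorId);            // FT.CURSOR DEL when finished early
```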

* Fixed an issue with supplying sorting arguments to the RediSearch command.

Fixed some assertions...

* Added support for return fields.

* Fixed an apparent typo.

* Moved this test class to be more consistent with the JRediSearch library.

* Cleaned up imports here.

* Initial pass at porting the tag for `AlterIndex`.

* Current progress porting the FT.ALTER command.

* Added in a new type for capturing the FT.INFO result...

* The test for AlterIndex (Add) is complete.
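
For reference, adding fields to an existing index now looks roughly like this (field types come from `NRediSearch.Schema`; `client` is assumed):

```csharp
// Issues FT.ALTER <index> SCHEMA ADD with the new field definitions.
bool altered = client.AlterIndex(
    new Schema.TagField("tags", ","),
    new Schema.TextField("name", 0.5));
```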

* Altered test here to meet assertions from JRediSearch.

* Ported support for the FT.MGET command.
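
A minimal sketch of the new batch fetch (missing ids come back as `null` entries; `client` is assumed):

```csharp
// FT.MGET returns one entry per requested id, in order.
Document[] docs = client.GetDocuments("doc1", "doc2", "doc3");
```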

* Ported the Suggestion with Builder from JRediSearch.

* Ported SuggestionOptions.

* Further fleshed out the suggestion options using JRediSearch as a guide.

* Ported over the expanded Suggestions functionality from JRediSearch.
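
A rough sketch of the ported suggestion API (names are taken from the code in this change; `client` is assumed):

```csharp
// Build and store a suggestion, optionally with a payload.
Suggestion entry = Suggestion.Builder
    .String("hello world")
    .Score(1.0)
    .Payload("some payload")
    .Build();
client.AddSuggestion(entry, increment: false);                // FT.SUGADD ... PAYLOAD ...

// Query it back, asking for payloads and scores.
SuggestionOptions options = SuggestionOptions.Builder
    .Max(5)
    .Fuzzy()
    .With(SuggestionOptions.WithOptions.PayloadsAndScores)
    .Build();
Suggestion[] matches = client.GetSuggestions("hel", options); // FT.SUGGET ... WITHPAYLOADS WITHSCORES
```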

* Ported this from JRediSearch.

* Ported more tests from JRediSearch.

Fixed some silly bugs...

* Ported the last three auto suggest tests from JRediSearch.

* More tests ported from JRediSearch.

* Implemented ability to add multiple documents at once. Started on
deleting multiple documents at once...
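
The batch helpers end up looking roughly like this (per-document success flags; `client` is assumed):

```csharp
bool[] added = client.AddDocuments(
    new Document("doc1").Set("title", "hello"),
    new Document("doc2").Set("title", "world"));

// Pass true to also delete the underlying documents (the DD flag on FT.DEL).
bool[] removed = client.DeleteDocuments(true, "doc1", "doc2");
```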

* In order to match the functionality found in JRediSearch, I'm catching
RedisServerExceptions that contain the message "Document already in
index".

* Added support for the `INKEYS` search query modifier.
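
A minimal usage sketch (the `LimitKeys` query method serializes as `INKEYS`; `client` is assumed):

```csharp
// Restrict the search to a specific set of document keys.
SearchResult res = client.Search(new Query("hello").LimitKeys("doc1", "doc2"));
```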

* Ported in a covering test for the AggregationBuilder.

* Cleaned up builder access in the Suggestion and SuggestionOptions
classes.

* Refactored IndexOptions to make them behave more like JRediSearch.

Removed NOSCOREIDX as that has been deprecated.

* PR feedback.

Marked AggregationBuilder as sealed because (at least initially) we
don't expect any inheritance here.

Dropped the call to `ToString` when appending the value for "MAX" to a
SortBy clause because it isn't needed.

Changed `ArgsString` to a method to indicate that we're not simply
exposing internal state. Made the method internal as it's present for
test purposes.

Removed the `Args` property because it's not clear that we need it. It
was ported from JRediSearch because it was there.

* Simplified this with a default value.

* Removed calls to `ToString` in order to allow the library to worry about
proper formatting.

* Combined constructors here to keep things a bit simpler.

* Cleaned up the unused import there.

* Readded the `OptimizeIndex` and `OptimizeIndexAsync` methods here to
preserve backwards compatibility.

* Returning Array.Empty<Document> instead of a new empty array each time.

* Sealed the suggestion class as there should be no reason (currently) for
anything to inherit from it.

Cleaned up how we're ensuring that an object to compare is not null and
is of an appropriate type.

Fixed equality check so that it doesn't blow up on null values (payload
specifically).

* Converted the class `With` to be an enum.

* Now looking up these values on demand.

* Reintroduced the original GetInfo API methods and changed the new ones
to GetInfoParsed and GetInfoParsedAsync.
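
A small sketch of the parsed variant (property names come from the new `InfoResult` type; `client` is assumed):

```csharp
InfoResult info = client.GetInfoParsed();        // wraps FT.INFO
string indexName = info.IndexName;
long numDocs = info.NumDocs;
bool hasTagField = info.Fields.ContainsKey("tags");
```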

* Reverted changes that turned ConfiguredIndexOptions into IndexOptions.

Added the `SetNoStopwords` method to match the JRediSearch API and to provide a convenient means of keeping the default index options while also specifying that no stopwords should be considered.

Fixed the `TestStopwords` unit test by calling `SetNoStopwords`, which adds the `DisableStopWords` option to the configured index options so that `STOPWORDS 0` is specified.
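
A minimal sketch of the restored API in use (`client` is assumed; only one of these options objects would be used per index):

```csharp
var schema = new Schema().AddTextField("title", 1.0);

// Custom stopword list: serialized as STOPWORDS 3 foo bar baz.
var withCustomStopwords = new ConfiguredIndexOptions().SetStopwords("foo", "bar", "baz");

// No stopwords at all: serialized as STOPWORDS 0.
var withoutStopwords = new ConfiguredIndexOptions().SetNoStopwords();

client.CreateIndex(schema, withoutStopwords);
```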

* Since this optimization doesn't exist anymore, it should be removed from your index definitions.

* Added back the original get suggestions method, but this time it calls
the new version which leverages the suggestion builder.

Added a small covering test.
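
The original signature keeps working and now goes through the builder internally (`client` is assumed):

```csharp
string[] suggestions = client.GetSuggestions("hel", fuzzy: true, max: 5);
```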

* Consolidated the constructors here as suggested by Marc.
Co-authored-by: Nick Craver <nrcraver@gmail.com>
Co-authored-by: Marc Gravell <marc.gravell@gmail.com>
parent ef5af00d
// .NET port of https://github.com/RedisLabs/JRediSearch/
using System.Collections.Generic;
using System.Linq;
using NRediSearch.Aggregation.Reducers;
namespace NRediSearch.Aggregation
{
public sealed class AggregationBuilder
{
private readonly List<object> _args = new List<object>();
public bool IsWithCursor { get; private set; }
internal string GetArgsString() => string.Join(" ", _args);
public AggregationBuilder(string query = "*") => _args.Add(query);
public AggregationBuilder Load(params string[] fields)
{
AddCommandArguments(_args, "LOAD", fields);
return this;
}
public AggregationBuilder Limit(int offset, int count)
{
var limit = new Limit(offset, count);
limit.SerializeRedisArgs(_args);
return this;
}
public AggregationBuilder Limit(int count) => Limit(0, count);
public AggregationBuilder SortBy(params SortedField[] fields)
{
_args.Add("SORTBY");
_args.Add(fields.Length * 2);
foreach (var field in fields)
{
_args.Add(field.Field);
_args.Add(field.OrderAsArg());
}
return this;
}
public AggregationBuilder SortBy(int max, params SortedField[] fields)
{
SortBy(fields);
if (max > 0)
{
_args.Add("MAX");
_args.Add(max);
}
return this;
}
public AggregationBuilder SortByAscending(string field) => SortBy(SortedField.Ascending(field));
public AggregationBuilder SortByDescending(string field) => SortBy(SortedField.Descending(field));
public AggregationBuilder Apply(string projection, string alias)
{
_args.Add("APPLY");
_args.Add(projection);
_args.Add("AS");
_args.Add(alias);
return this;
}
public AggregationBuilder GroupBy(IReadOnlyCollection<string> fields, IReadOnlyCollection<Reducer> reducers)
{
var group = new Group(fields.ToArray());
foreach (var r in reducers)
{
group.Reduce(r);
}
GroupBy(group);
return this;
}
public AggregationBuilder GroupBy(string field, params Reducer[] reducers) => GroupBy(new[] { field }, reducers);
public AggregationBuilder GroupBy(Group group)
{
_args.Add("GROUPBY");
group.SerializeRedisArgs(_args);
return this;
}
public AggregationBuilder Filter(string expression)
{
_args.Add("FILTER");
_args.Add(expression);
return this;
}
public AggregationBuilder Cursor(int count, long maxIdle)
{
IsWithCursor = true;
if (count > 0)
{
_args.Add("WITHCURSOR");
_args.Add("COUNT");
_args.Add(count);
if (maxIdle < long.MaxValue && maxIdle >= 0)
{
_args.Add("MAXIDLE");
_args.Add(maxIdle);
}
}
return this;
}
internal void SerializeRedisArgs(List<object> args)
{
foreach (var arg in _args)
{
args.Add(arg);
}
}
private static void AddCommandLength(List<object> list, string command, int length)
{
list.Add(command);
list.Add(length);
}
private static void AddCommandArguments(List<object> destination, string command, IReadOnlyCollection<object> source)
{
AddCommandLength(destination, command, source.Count);
destination.AddRange(source);
}
}
}
......@@ -9,7 +9,8 @@ namespace NRediSearch
public sealed class AggregationResult
{
private readonly Dictionary<string, RedisValue>[] _results;
internal AggregationResult(RedisResult result)
internal AggregationResult(RedisResult result, long cursorId = -1)
{
var arr = (RedisResult[])result;
......@@ -27,6 +28,8 @@ internal AggregationResult(RedisResult result)
}
_results[i - 1] = cur;
}
CursorId = cursorId;
}
public IReadOnlyList<Dictionary<string, RedisValue>> GetResults() => _results;
......@@ -38,5 +41,7 @@ internal AggregationResult(RedisResult result)
if (index >= _results.Length) return null;
return new Row(_results[index]);
}
public long CursorId { get; }
}
}
// .NET port of https://github.com/RedisLabs/JRediSearch/
using NRediSearch.Aggregation;
using StackExchange.Redis;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using NRediSearch.Aggregation;
using StackExchange.Redis;
using static NRediSearch.Schema;
using static NRediSearch.SuggestionOptions;
namespace NRediSearch
{
......@@ -28,25 +30,33 @@ public enum IndexOptions
/// </summary>
KeepFieldFlags = 2,
/// <summary>
/// The default indexing options - use term offsets and keep fields flags
/// The default indexing options - use term offsets, keep fields flags, keep term frequencies
/// </summary>
Default = UseTermOffsets | KeepFieldFlags,
Default = UseTermOffsets | KeepFieldFlags | KeepTermFrequencies,
/// <summary>
/// If set, we keep an index of the top entries per term, allowing extremely fast single word queries
/// regardless of index size, at the cost of more memory
/// </summary>
[Obsolete("'NOSCOREIDX' was removed from RediSearch.", true)]
UseScoreIndexes = 4,
/// <summary>
/// If set, we will disable the Stop-Words completely
/// </summary>
DisableStopWords = 8
DisableStopWords = 8,
/// <summary>
/// If set, we keep an index of the top entries per term, allowing extremely fast single word queries
/// regardless of index size, at the cost of more memory
/// </summary>
KeepTermFrequencies = 16
}
public sealed class ConfiguredIndexOptions
{
public static IndexOptions Default => new IndexOptions();
private IndexOptions _options;
private string[] _stopwords;
public ConfiguredIndexOptions(IndexOptions options)
public ConfiguredIndexOptions(IndexOptions options = IndexOptions.Default)
{
_options = options;
}
......@@ -63,34 +73,39 @@ public ConfiguredIndexOptions SetStopwords(params string[] stopwords)
return this;
}
public ConfiguredIndexOptions SetNoStopwords()
{
_options |= IndexOptions.DisableStopWords;
return this;
}
internal void SerializeRedisArgs(List<object> args)
{
SerializeRedisArgs(_options, args);
if (_stopwords != null && _stopwords.Length != 0)
{
// note that DisableStopWords will not be set in this case
args.Add("STOPWORDS".Literal());
args.Add(_stopwords.Length.Boxed());
foreach (var word in _stopwords)
args.Add(word);
args.AddRange(_stopwords);
}
}
internal static void SerializeRedisArgs(IndexOptions flags, List<object> args)
internal static void SerializeRedisArgs(IndexOptions options, List<object> args)
{
if ((flags & IndexOptions.UseTermOffsets) == 0)
if ((options & IndexOptions.UseTermOffsets) == 0)
{
args.Add("NOOFFSETS".Literal());
}
if ((flags & IndexOptions.KeepFieldFlags) == 0)
if ((options & IndexOptions.KeepFieldFlags) == 0)
{
args.Add("NOFIELDS".Literal());
}
if ((flags & IndexOptions.UseScoreIndexes) == 0)
if ((options & IndexOptions.KeepTermFrequencies) == 0)
{
args.Add("NOSCOREIDX".Literal());
args.Add("NOFREQS".Literal());
}
if ((flags & IndexOptions.DisableStopWords) == IndexOptions.DisableStopWords)
if ((options & IndexOptions.DisableStopWords) == IndexOptions.DisableStopWords)
{
args.Add("STOPWORDS".Literal());
args.Add(0.Boxed());
......@@ -118,13 +133,13 @@ public Client(RedisKey indexName, IDatabaseAsync db)
/// <param name="schema">a schema definition <seealso cref="Schema"/></param>
/// <param name="options">index option flags <seealso cref="IndexOptions"/></param>
/// <returns>true if successful</returns>
public bool CreateIndex(Schema schema, IndexOptions options)
public bool CreateIndex(Schema schema, ConfiguredIndexOptions options)
{
var args = new List<object>
{
_boxedIndexName
};
ConfiguredIndexOptions.SerializeRedisArgs(options, args);
options.SerializeRedisArgs(args);
args.Add("SCHEMA".Literal());
foreach (var f in schema.Fields)
......@@ -141,7 +156,7 @@ public bool CreateIndex(Schema schema, IndexOptions options)
/// <param name="schema">a schema definition <seealso cref="Schema"/></param>
/// <param name="options">index option flags <seealso cref="IndexOptions"/></param>
/// <returns>true if successful</returns>
public bool CreateIndex(Schema schema, ConfiguredIndexOptions options)
public async Task<bool> CreateIndexAsync(Schema schema, ConfiguredIndexOptions options)
{
var args = new List<object>
{
......@@ -155,53 +170,51 @@ public bool CreateIndex(Schema schema, ConfiguredIndexOptions options)
f.SerializeRedisArgs(args);
}
return (string)DbSync.Execute("FT.CREATE", args) == "OK";
return (string)await _db.ExecuteAsync("FT.CREATE", args).ConfigureAwait(false) == "OK";
}
/// <summary>
/// Create the index definition in redis
/// Alter index add fields
/// </summary>
/// <param name="schema">a schema definition <seealso cref="Schema"/></param>
/// <param name="options">index option flags <seealso cref="IndexOptions"/></param>
/// <returns>true if successful</returns>
public async Task<bool> CreateIndexAsync(Schema schema, IndexOptions options)
/// <param name="fields">list of fields</param>
/// <returns>`true` if successful</returns>
public bool AlterIndex(params Field[] fields)
{
var args = new List<object>
{
_boxedIndexName
_boxedIndexName,
"SCHEMA".Literal(),
"ADD".Literal()
};
ConfiguredIndexOptions.SerializeRedisArgs(options, args);
args.Add("SCHEMA".Literal());
foreach (var f in schema.Fields)
foreach (var field in fields)
{
f.SerializeRedisArgs(args);
field.SerializeRedisArgs(args);
}
return (string)await _db.ExecuteAsync("FT.CREATE", args).ConfigureAwait(false) == "OK";
return (string)DbSync.Execute("FT.ALTER", args) == "OK";
}
/// <summary>
/// Create the index definition in redis
/// Alter index add fields
/// </summary>
/// <param name="schema">a schema definition <seealso cref="Schema"/></param>
/// <param name="options">index option flags <seealso cref="IndexOptions"/></param>
/// <returns>true if successful</returns>
public async Task<bool> CreateIndexAsync(Schema schema, ConfiguredIndexOptions options)
/// <param name="fields">list of fields</param>
/// <returns>`true` if successful</returns>
public async Task<bool> AlterIndexAsync(params Field[] fields)
{
var args = new List<object>
{
_boxedIndexName
_boxedIndexName,
"SCHEMA".Literal(),
"ADD".Literal()
};
options.SerializeRedisArgs(args);
args.Add("SCHEMA".Literal());
foreach (var f in schema.Fields)
foreach (var field in fields)
{
f.SerializeRedisArgs(args);
field.SerializeRedisArgs(args);
}
return (string)await _db.ExecuteAsync("FT.CREATE", args).ConfigureAwait(false) == "OK";
return (string)(await _db.ExecuteAsync("FT.ALTER", args).ConfigureAwait(false)) == "OK";
}
/// <summary>
......@@ -278,23 +291,41 @@ public bool AddDocument(string docId, Dictionary<string, RedisValue> fields, dou
/// <param name="noSave">if set, we only index the document and do not save its contents. This allows fetching just doc ids</param>
/// <param name="replace">if set, and the document already exists, we reindex and update it</param>
/// <param name="payload">if set, we can save a payload in the index to be retrieved or evaluated by scoring functions on the server</param>
/// <returns>true if the operation succeeded, false otherwise</returns>
public async Task<bool> AddDocumentAsync(string docId, Dictionary<string, RedisValue> fields, double score = 1.0, bool noSave = false, bool replace = false, byte[] payload = null)
{
var args = BuildAddDocumentArgs(docId, fields, score, noSave, replace, payload);
try
{
return (string)await _db.ExecuteAsync("FT.ADD", args).ConfigureAwait(false) == "OK";
}
catch (RedisServerException ex) when (ex.Message == "Document already in index")
{
return false;
}
}
/// <summary>
/// Add a document to the index
/// </summary>
/// <param name="doc">The document to add</param>
/// <param name="options">Options for the operation</param>
/// <returns>true if the operation succeeded, false otherwise. Note that if the operation fails, an exception will be thrown</returns>
/// <returns>true if the operation succeeded, false otherwise</returns>
public bool AddDocument(Document doc, AddOptions options = null)
{
var args = BuildAddDocumentArgs(doc.Id, doc._properties, doc.Score, options?.NoSave ?? false, options?.ReplacePolicy ?? AddOptions.ReplacementPolicy.None, doc.Payload, options?.Language);
try
{
return (string)DbSync.Execute("FT.ADD", args) == "OK";
}
catch (RedisServerException ex) when (ex.Message == "Document already in index")
{
return false;
}
}
/// <summary>
/// Add a document to the index
......@@ -308,6 +339,58 @@ public async Task<bool> AddDocumentAsync(Document doc, AddOptions options = null
return (string)await _db.ExecuteAsync("FT.ADD", args).ConfigureAwait(false) == "OK";
}
/// <summary>
/// Add a batch of documents to the index.
/// </summary>
/// <param name="documents">The documents to add</param>
/// <returns>`true` on success for each document</returns>
public bool[] AddDocuments(params Document[] documents) =>
AddDocuments(new AddOptions(), documents);
/// <summary>
/// Add a batch of documents to the index
/// </summary>
/// <param name="options">Options for the operation</param>
/// <param name="documents">The documents to add</param>
/// <returns>`true` on success for each document</returns>
public bool[] AddDocuments(AddOptions options, params Document[] documents)
{
var result = new bool[documents.Length];
for (var i = 0; i < documents.Length; i++)
{
result[i] = AddDocument(documents[i], options);
}
return result;
}
/// <summary>
/// Add a batch of documents to the index.
/// </summary>
/// <param name="documents">The documents to add</param>
/// <returns>`true` on success for each document</returns>
public Task<bool[]> AddDocumentsAsync(params Document[] documents) =>
AddDocumentsAsync(new AddOptions(), documents);
/// <summary>
/// Add a batch of documents to the index
/// </summary>
/// <param name="options">Options for the operation</param>
/// <param name="documents">The documents to add</param>
/// <returns>`true` on success for each document</returns>
public async Task<bool[]> AddDocumentsAsync(AddOptions options, params Document[] documents)
{
var result = new bool[documents.Length];
for (var i = 0; i < documents.Length; i++)
{
result[i] = await AddDocumentAsync(documents[i], options);
}
return result;
}
private List<object> BuildAddDocumentArgs(string docId, Dictionary<string, RedisValue> fields, double score, bool noSave, bool replace, byte[] payload)
=> BuildAddDocumentArgs(docId, fields, score, noSave, replace ? AddOptions.ReplacementPolicy.Full : AddOptions.ReplacementPolicy.None, payload, null);
private List<object> BuildAddDocumentArgs(string docId, Dictionary<string, RedisValue> fields, double score, bool noSave, AddOptions.ReplacementPolicy replacementPolicy, byte[] payload, string language)
......@@ -419,17 +502,15 @@ public async Task<bool> AddHashAsync(RedisKey docId, double score, bool replace)
}
/// <summary>
/// Get the index info, including memory consumption and other statistics.
/// Get the index info, including memory consumption and other statistics
/// </summary>
/// <remarks>TODO: Make a class for easier access to the index properties</remarks>
/// <returns>a map of key/value pairs</returns>
public Dictionary<string, RedisValue> GetInfo() =>
ParseGetInfo(DbSync.Execute("FT.INFO", _boxedIndexName));
/// <summary>
/// Get the index info, including memory consumption and other statistics.
/// Get the index info, including memory consumption and other statistics
/// </summary>
/// <remarks>TODO: Make a class for easier access to the index properties</remarks>
/// <returns>a map of key/value pairs</returns>
public async Task<Dictionary<string, RedisValue>> GetInfoAsync() =>
ParseGetInfo(await _db.ExecuteAsync("FT.INFO", _boxedIndexName).ConfigureAwait(false));
......@@ -449,24 +530,100 @@ public async Task<bool> AddHashAsync(RedisKey docId, double score, bool replace)
return info;
}
/// <summary>
/// Get the index info, including memory consumption and other statistics.
/// </summary>
/// <returns>An `InfoResult` object with parsed values from the FT.INFO command.</returns>
public InfoResult GetInfoParsed() =>
new InfoResult(DbSync.Execute("FT.INFO", _boxedIndexName));
/// <summary>
/// Get the index info, including memory consumption and other statistics.
/// </summary>
/// <returns>An `InfoResult` object with parsed values from the FT.INFO command.</returns>
public async Task<InfoResult> GetInfoParsedAsync() =>
new InfoResult(await _db.ExecuteAsync("FT.INFO", _boxedIndexName).ConfigureAwait(false));
/// <summary>
/// Delete a document from the index.
/// </summary>
/// <param name="docId">the document's id</param>
/// <param name="deleteDocument">if <code>true</code> also deletes the actual document if it is in the index</param>
/// <returns>true if it has been deleted, false if it did not exist</returns>
public bool DeleteDocument(string docId)
public bool DeleteDocument(string docId, bool deleteDocument = false)
{
var args = new List<object>
{
_boxedIndexName,
docId
};
if (deleteDocument)
{
return (long)DbSync.Execute("FT.DEL", _boxedIndexName, docId) == 1;
args.Add("DD".Literal());
}
return (long)DbSync.Execute("FT.DEL", args) == 1;
}
/// <summary>
/// Delete a document from the index.
/// </summary>
/// <param name="docId">the document's id</param>
/// <param name="docId">the document's id</param>
/// <returns>true if it has been deleted, false if it did not exist</returns>
public async Task<bool> DeleteDocumentAsync(string docId)
public async Task<bool> DeleteDocumentAsync(string docId, bool deleteDocument = false)
{
var args = new List<object>
{
return (long)await _db.ExecuteAsync("FT.DEL", _boxedIndexName, docId).ConfigureAwait(false) == 1;
_boxedIndexName,
docId
};
if (deleteDocument)
{
args.Add("DD".Literal());
}
return (long)await _db.ExecuteAsync("FT.DEL", args).ConfigureAwait(false) == 1;
}
/// <summary>
/// Delete multiple documents from an index.
/// </summary>
/// <param name="deleteDocuments">if <code>true</code> also deletes the actual document ifs it is in the index</param>
/// <param name="docIds">the document ids to delete</param>
/// <returns>true on success for each document if it has been deleted, false if it did not exist</returns>
public bool[] DeleteDocuments(bool deleteDocuments, params string[] docIds)
{
var result = new bool[docIds.Length];
for (var i = 0; i < docIds.Length; i++)
{
result[i] = DeleteDocument(docIds[i], deleteDocuments);
}
return result;
}
/// <summary>
/// Delete multiple documents from an index.
/// </summary>
/// <param name="deleteDocuments">if <code>true</code> also deletes the actual document ifs it is in the index</param>
/// <param name="docIds">the document ids to delete</param>
/// <returns>true on success for each document if it has been deleted, false if it did not exist</returns>
public async Task<bool[]> DeleteDocumentsAsync(bool deleteDocuments, params string[] docIds)
{
var result = new bool[docIds.Length];
for (var i = 0; i < docIds.Length; i++)
{
result[i] = await DeleteDocumentAsync(docIds[i], deleteDocuments);
}
return result;
}
/// <summary>
......@@ -487,19 +644,21 @@ public async Task<bool> DropIndexAsync()
}
/// <summary>
/// Optimize memory consumption of the index by removing extra saved capacity. This does not affect speed
/// [Deprecated] Optimize memory consumption of the index by removing extra saved capacity. This does not affect speed
/// </summary>
[Obsolete("Index optimizations are done by the internal garbage collector in the background.")]
public long OptimizeIndex()
{
return (long)DbSync.Execute("FT.OPTIMIZE", _boxedIndexName);
return default;
}
/// <summary>
/// Optimize memory consumption of the index by removing extra saved capacity. This does not affect speed
/// [Deprecated] Optimize memory consumption of the index by removing extra saved capacity. This does not affect speed
/// </summary>
public async Task<long> OptimizeIndexAsync()
[Obsolete("Index optimizations are done by the internal garbage collector in the background.")]
public Task<long> OptimizeIndexAsync()
{
return (long)await _db.ExecuteAsync("FT.OPTIMIZE", _boxedIndexName).ConfigureAwait(false);
return Task.FromResult(default(long));
}
/// <summary>
......@@ -517,30 +676,58 @@ public async Task<long> CountSuggestionsAsync()
/// <summary>
/// Add a suggestion string to an auto-complete suggestion dictionary. This is disconnected from the index definitions, and leaves creating and updating suggestion dictionaries to the user.
/// </summary>
/// <param name="value">the suggestion string we index</param>
/// <param name="score">a floating point number of the suggestion string's weight</param>
/// <param name="suggestion">the Suggestion to be added</param>
/// <param name="increment">if set, we increment the existing entry of the suggestion by the given score, instead of replacing the score. This is useful for updating the dictionary based on user queries in real time</param>
/// <returns>the current size of the suggestion dictionary.</returns>
public long AddSuggestion(string value, double score, bool increment = false)
public long AddSuggestion(Suggestion suggestion, bool increment = false)
{
var args = new List<object>
{
object[] args = increment
? new object[] { _boxedIndexName, value, score, "INCR".Literal() }
: new object[] { _boxedIndexName, value, score };
_boxedIndexName,
suggestion.String,
suggestion.Score
};
if (increment)
{
args.Add("INCR".Literal());
}
if (suggestion.Payload != null)
{
args.Add("PAYLOAD".Literal());
args.Add(suggestion.Payload);
}
return (long)DbSync.Execute("FT.SUGADD", args);
}
/// <summary>
/// Add a suggestion string to an auto-complete suggestion dictionary. This is disconnected from the index definitions, and leaves creating and updating suggestion dictionaries to the user.
/// </summary>
/// <param name="value">the suggestion string we index</param>
/// <param name="score">a floating point number of the suggestion string's weight</param>
/// <param name="suggestion">the Suggestion to be added</param>
/// <param name="increment">if set, we increment the existing entry of the suggestion by the given score, instead of replacing the score. This is useful for updating the dictionary based on user queries in real time</param>
/// <returns>the current size of the suggestion dictionary.</returns>
public async Task<long> AddSuggestionAsync(string value, double score, bool increment = false)
public async Task<long> AddSuggestionAsync(Suggestion suggestion, bool increment = false)
{
var args = new List<object>
{
object[] args = increment
? new object[] { _boxedIndexName, value, score, "INCR".Literal() }
: new object[] { _boxedIndexName, value, score };
_boxedIndexName,
suggestion.String,
suggestion.Score
};
if (increment)
{
args.Add("INCR".Literal());
}
if (suggestion.Payload != null)
{
args.Add("PAYLOAD".Literal());
args.Add(suggestion.Payload);
}
return (long)await _db.ExecuteAsync("FT.SUGADD", args).ConfigureAwait(false);
}
......@@ -567,15 +754,76 @@ public async Task<bool> DeleteSuggestionAsync(string value)
/// <returns>a list of the top suggestions matching the prefix</returns>
public string[] GetSuggestions(string prefix, bool fuzzy = false, int max = 5)
{
var args = new List<object> { _boxedIndexName, prefix };
if (fuzzy) args.Add("FUZZY".Literal());
if (max != 5)
var optionsBuilder = SuggestionOptions.Builder.Max(max);
if (fuzzy)
{
optionsBuilder.Fuzzy();
}
var suggestions = GetSuggestions(prefix, optionsBuilder.Build());
var result = new string[suggestions.Length];
for (var i = 0; i < suggestions.Length; i++)
{
result[i] = suggestions[i].String;
}
return result;
}
/// <summary>
/// Get completion suggestions for a prefix
/// </summary>
/// <param name="prefix">the prefix to complete on</param>
/// <param name="suggestionOptions"> the options on what you need returned and other usage</param>
/// <returns>a list of the top suggestions matching the prefix</returns>
public Suggestion[] GetSuggestions(string prefix, SuggestionOptions options)
{
var args = new List<object>
{
_boxedIndexName,
prefix,
"MAX".Literal(),
options.Max.Boxed()
};
if (options.Fuzzy)
{
args.Add("FUZZY".Literal());
}
if (options.With != WithOptions.None)
{
args.AddRange(options.GetFlags());
}
var results = (RedisResult[])DbSync.Execute("FT.SUGGET", args);
if (options.With == WithOptions.None)
{
return GetSuggestionsNoOptions(results);
}
if (options.GetIsPayloadAndScores())
{
return GetSuggestionsWithPayloadAndScores(results);
}
if (options.GetIsPayload())
{
args.Add("MAX".Literal());
args.Add(max.Boxed());
return GetSuggestionsWithPayload(results);
}
return (string[])DbSync.Execute("FT.SUGGET", args);
if (options.GetIsScores())
{
return GetSuggestionsWithScores(results);
}
return default;
}
/// <summary>
/// Get completion suggestions for a prefix
/// </summary>
......@@ -585,20 +833,82 @@ public string[] GetSuggestions(string prefix, bool fuzzy = false, int max = 5)
/// <returns>a list of the top suggestions matching the prefix</returns>
public async Task<string[]> GetSuggestionsAsync(string prefix, bool fuzzy = false, int max = 5)
{
var args = new List<object> { _boxedIndexName, prefix };
if (fuzzy) args.Add("FUZZY".Literal());
if (max != 5)
var optionsBuilder = SuggestionOptions.Builder.Max(max);
if (fuzzy)
{
optionsBuilder.Fuzzy();
}
var suggestions = await GetSuggestionsAsync(prefix, optionsBuilder.Build());
var result = new string[suggestions.Length];
for (var i = 0; i < suggestions.Length; i++)
{
result[i] = suggestions[i].String;
}
return result;
}
/// <summary>
/// Get completion suggestions for a prefix
/// </summary>
/// <param name="prefix">the prefix to complete on</param>
/// <param name="suggestionOptions"> the options on what you need returned and other usage</param>
/// <returns>a list of the top suggestions matching the prefix</returns>
public async Task<Suggestion[]> GetSuggestionsAsync(string prefix, SuggestionOptions options)
{
var args = new List<object>
{
args.Add("MAX".Literal());
args.Add(max.Boxed());
_boxedIndexName,
prefix,
"MAX".Literal(),
options.Max.Boxed()
};
if (options.Fuzzy)
{
args.Add("FUZZY".Literal());
}
return (string[])await _db.ExecuteAsync("FT.SUGGET", args).ConfigureAwait(false);
if (options.With != WithOptions.None)
{
args.AddRange(options.GetFlags());
}
var results = (RedisResult[])await _db.ExecuteAsync("FT.SUGGET", args).ConfigureAwait(false);
if (options.With == WithOptions.None)
{
return GetSuggestionsNoOptions(results);
}
if (options.GetIsPayloadAndScores())
{
return GetSuggestionsWithPayloadAndScores(results);
}
if (options.GetIsPayload())
{
return GetSuggestionsWithPayload(results);
}
if (options.GetIsScores())
{
return GetSuggestionsWithScores(results);
}
return default;
}
/// <summary>
/// Perform an aggregate query
/// </summary>
/// <param name="query">The query to watch</param>
[Obsolete("Use `Aggregate` method that takes an `AggregationBuilder`.")]
public AggregationResult Aggregate(AggregationRequest query)
{
var args = new List<object>
......@@ -611,10 +921,12 @@ public AggregationResult Aggregate(AggregationRequest query)
return new AggregationResult(resp);
}
/// <summary>
/// Perform an aggregate query
/// </summary>
/// <param name="query">The query to watch</param>
[Obsolete("Use `AggregateAsync` method that takes an `AggregationBuilder`.")]
public async Task<AggregationResult> AggregateAsync(AggregationRequest query)
{
var args = new List<object>
......@@ -628,6 +940,148 @@ public async Task<AggregationResult> AggregateAsync(AggregationRequest query)
return new AggregationResult(resp);
}
/// <summary>
/// Perform an aggregate query
/// </summary>
/// <param name="query">The query to watch</param>
public AggregationResult Aggregate(AggregationBuilder query)
{
var args = new List<object>
{
_boxedIndexName
};
query.SerializeRedisArgs(args);
var resp = DbSync.Execute("FT.AGGREGATE", args);
if (query.IsWithCursor)
{
var respArray = (RedisResult[])resp;
return new AggregationResult(respArray[0], (long)respArray[1]);
}
else
{
return new AggregationResult(resp);
}
}
/// <summary>
/// Perform an aggregate query
/// </summary>
/// <param name="query">The query to watch</param>
public async Task<AggregationResult> AggregateAsync(AggregationBuilder query)
{
var args = new List<object>
{
_boxedIndexName
};
query.SerializeRedisArgs(args);
var resp = await _db.ExecuteAsync("FT.AGGREGATE", args).ConfigureAwait(false);
if (query.IsWithCursor)
{
var respArray = (RedisResult[])resp;
return new AggregationResult(respArray[0], (long)respArray[1]);
}
else
{
return new AggregationResult(resp);
}
}
/// <summary>
/// Read from an existing aggregate cursor.
/// </summary>
/// <param name="cursorId">The cursor's ID.</param>
/// <param name="count">Limit the amount of returned results.</param>
/// <returns>An AggregationResult object with the results</returns>
public AggregationResult CursorRead(long cursorId, int count = -1)
{
var args = new List<object>
{
"READ",
_boxedIndexName,
cursorId
};
if (count > -1)
{
args.Add("COUNT");
args.Add(count);
}
RedisResult[] resp = (RedisResult[])DbSync.Execute("FT.CURSOR", args);
return new AggregationResult(resp[0], (long)resp[1]);
}
/// <summary>
/// Read from an existing aggregate cursor.
/// </summary>
/// <param name="cursorId">The cursor's ID.</param>
/// <param name="count">Limit the amount of returned results.</param>
/// <returns>An AggregationResult object with the results</returns>
public async Task<AggregationResult> CursorReadAsync(long cursorId, int count)
{
var args = new List<object>
{
"READ",
_boxedIndexName,
cursorId
};
if (count > -1)
{
args.Add("COUNT");
args.Add(count);
}
RedisResult[] resp = (RedisResult[])(await _db.ExecuteAsync("FT.CURSOR", args).ConfigureAwait(false));
return new AggregationResult(resp[0], (long)resp[1]);
}
/// <summary>
/// Delete a cursor from the index.
/// </summary>
/// <param name="cursorId">The cursor's ID.</param>
/// <returns>`true` if it has been deleted, `false` if it did not exist.</returns>
public bool CursorDelete(long cursorId)
{
var args = new List<object>
{
"DEL",
_boxedIndexName,
cursorId
};
return (string)DbSync.Execute("FT.CURSOR", args) == "OK";
}
/// <summary>
/// Delete a cursor from the index.
/// </summary>
/// <param name="cursorId">The cursor's ID.</param>
/// <returns>`true` if it has been deleted, `false` if it did not exist.</returns>
public async Task<bool> CursorDeleteAsync(long cursorId)
{
var args = new List<object>
{
"DEL",
_boxedIndexName,
cursorId
};
return (string)(await _db.ExecuteAsync("FT.CURSOR", args).ConfigureAwait(false)) == "OK";
}
/// <summary>
/// Generate an explanatory textual query tree for this query string
/// </summary>
......@@ -674,6 +1128,92 @@ public Document GetDocument(string docId)
public async Task<Document> GetDocumentAsync(string docId)
=> Document.Parse(docId, await _db.ExecuteAsync("FT.GET", _boxedIndexName, docId).ConfigureAwait(false));
/// <summary>
/// Gets a series of documents from the index.
/// </summary>
/// <param name="docIds">The document IDs to retrieve.</param>
/// <returns>The documents stored in the index. If the document does not exist, null is returned in the list.</returns>
public Document[] GetDocuments(params string[] docIds)
{
if (docIds.Length == 0)
{
return Array.Empty<Document>();
}
var args = new List<object>
{
_boxedIndexName
};
foreach (var docId in docIds)
{
args.Add(docId);
}
var queryResults = (RedisResult[])DbSync.Execute("FT.MGET", args);
var result = new Document[docIds.Length];
for (var i = 0; i < docIds.Length; i++)
{
var queryResult = queryResults[i];
if (queryResult.IsNull)
{
result[i] = null;
}
else
{
result[i] = Document.Parse(docIds[i], queryResult);
}
}
return result;
}
/// <summary>
/// Gets a series of documents from the index.
/// </summary>
/// <param name="docIds">The document IDs to retrieve.</param>
/// <returns>The documents stored in the index. If the document does not exist, null is returned in the list.</returns>
public async Task<Document[]> GetDocumentsAsync(params string[] docIds)
{
if (docIds.Length == 0)
{
return Array.Empty<Document>();
}
var args = new List<object>
{
_boxedIndexName
};
foreach (var docId in docIds)
{
args.Add(docId);
}
var queryResults = (RedisResult[])await _db.ExecuteAsync("FT.MGET", args).ConfigureAwait(false);
var result = new Document[docIds.Length];
for (var i = 0; i < docIds.Length; i++)
{
var queryResult = queryResults[i];
if (queryResult.IsNull)
{
result[i] = null;
}
else
{
result[i] = Document.Parse(docIds[i], queryResult);
}
}
return result;
}
/// <summary>
/// Replace specific fields in a document. Unlike #replaceDocument(), fields not present in the field list
/// are not erased, but retained. This avoids reindexing the entire document if the new values are not
......@@ -701,5 +1241,69 @@ public async Task<bool> UpdateDocumentAsync(string docId, Dictionary<string, Red
var args = BuildAddDocumentArgs(docId, fields, score, false, AddOptions.ReplacementPolicy.Partial, null, null);
return (string)await _db.ExecuteAsync("FT.ADD", args).ConfigureAwait(false) == "OK";
}
private static Suggestion[] GetSuggestionsNoOptions(RedisResult[] results)
{
var suggestions = new Suggestion[results.Length];
for (var i = 0; i < results.Length; i++)
{
suggestions[i] = Suggestion.Builder.String((string)results[i]).Build();
}
return suggestions;
}
private static Suggestion[] GetSuggestionsWithPayloadAndScores(RedisResult[] results)
{
var suggestions = new Suggestion[results.Length / 3];
for (var i = 3; i <= results.Length; i += 3)
{
var suggestion = Suggestion.Builder;
suggestion.String((string)results[i - 3]);
suggestion.Score((double)results[i - 2]);
suggestion.Payload((string)results[i - 1]);
suggestions[(i / 3) - 1] = suggestion.Build();
}
return suggestions;
}
private static Suggestion[] GetSuggestionsWithPayload(RedisResult[] results)
{
var suggestions = new Suggestion[results.Length / 2];
for (var i = 2; i <= results.Length; i += 2)
{
var suggestion = Suggestion.Builder;
suggestion.String((string)results[i - 2]);
suggestion.Payload((string)results[i - 1]);
suggestions[(i / 2) - 1] = suggestion.Build();
}
return suggestions;
}
private static Suggestion[] GetSuggestionsWithScores(RedisResult[] results)
{
var suggestions = new Suggestion[results.Length / 2];
for (var i = 2; i <= results.Length; i += 2)
{
var suggestion = Suggestion.Builder;
suggestion.String((string)results[i - 2]);
suggestion.Score((double)results[i - 1]);
suggestions[(i / 2) - 1] = suggestion.Build();
}
return suggestions;
}
}
}
// .NET port of https://github.com/RedisLabs/JRediSearch/
using System;
using System.Collections.Generic;
using StackExchange.Redis;
......@@ -18,7 +17,7 @@ public class Document
public Document(string id, double score, byte[] payload) : this(id, null, score, payload) { }
public Document(string id) : this(id, null, 1.0, null) { }
public Document(string id, Dictionary<string, RedisValue> fields, double score) : this(id, fields, score, null) { }
public Document(string id, Dictionary<string, RedisValue> fields, double score = 1.0) : this(id, fields, score, null) { }
public Document(string id, Dictionary<string, RedisValue> fields, double score, byte[] payload)
{
......
......@@ -3,20 +3,11 @@
using System;
using System.Globalization;
using StackExchange.Redis;
using static NRediSearch.Client;
namespace NRediSearch
{
public static class Extensions
{
/// <summary>
/// Set a custom stopword list
/// </summary>
/// <param name="options">The <see cref="IndexOptions"/> to set stopwords on.</param>
/// <param name="stopwords">The stopwords to set.</param>
public static ConfiguredIndexOptions SetStopwords(this IndexOptions options, params string[] stopwords)
=> new ConfiguredIndexOptions(options).SetStopwords(stopwords);
internal static string AsRedisString(this double value, bool forceDecimal = false)
{
if (double.IsNegativeInfinity(value))
......
using System.Collections.Generic;
using StackExchange.Redis;
namespace NRediSearch
{
public class InfoResult
{
private readonly Dictionary<string, RedisResult> _all = new Dictionary<string, RedisResult>();
public string IndexName => GetString("index_name");
public Dictionary<string, RedisResult[]> Fields => GetRedisResultsDictionary("fields");
public long NumDocs => GetLong("num_docs");
public long NumTerms => GetLong("num_terms");
public long NumRecords => GetLong("num_records");
public double InvertedSzMebibytes => GetDouble("inverted_sz_mb");
public double InvertedCapMebibytes => GetDouble("inverted_cap_mb");
public double InvertedCapOvh => GetDouble("inverted_cap_ovh");
public double OffsetVectorsSzMebibytes => GetDouble("offset_vectors_sz_mb");
public double SkipIndexSizeMebibytes => GetDouble("skip_index_size_mb");
public double ScoreIndexSizeMebibytes => GetDouble("score_index_size_mb");
public double RecordsPerDocAvg => GetDouble("records_per_doc_avg");
public double BytesPerRecordAvg => GetDouble("bytes_per_record_avg");
public double OffsetsPerTermAvg => GetDouble("offsets_per_term_avg");
public double OffsetBitsPerRecordAvg => GetDouble("offset_bits_per_record_avg");
public string MaxDocId => GetString("max_doc_id");
public double DocTableSizeMebibytes => GetDouble("doc_table_size_mb");
public double SortableValueSizeMebibytes => GetDouble("sortable_value_size_mb");
public double KeyTableSizeMebibytes => GetDouble("key_table_size_mb");
public Dictionary<string, RedisResult> GcStats => GetRedisResultDictionary("gc_stats");
public Dictionary<string, RedisResult> CursorStats => GetRedisResultDictionary("cursor_stats");
public InfoResult(RedisResult result)
{
var results = (RedisResult[])result;
for (var i = 0; i < results.Length; i += 2)
{
var key = (string)results[i];
var value = results[i + 1];
_all.Add(key, value);
}
}
private string GetString(string key) => _all.TryGetValue(key, out var value) ? (string)value : default;
private long GetLong(string key) => _all.TryGetValue(key, out var value) ? (long)value : default;
private double GetDouble(string key)
{
if (_all.TryGetValue(key, out var value))
{
if ((string)value == "-nan")
{
return default;
}
else
{
return (double)value;
}
}
else
{
return default;
}
}
private Dictionary<string, RedisResult> GetRedisResultDictionary(string key)
{
if (_all.TryGetValue(key, out var value))
{
var values = (RedisResult[])value;
var result = new Dictionary<string, RedisResult>();
for (var ii = 0; ii < values.Length; ii += 2)
{
result.Add((string)values[ii], values[ii + 1]);
}
return result;
}
else
{
return default;
}
}
private Dictionary<string, RedisResult[]> GetRedisResultsDictionary(string key)
{
if (_all.TryGetValue(key, out var value))
{
var result = new Dictionary<string, RedisResult[]>();
foreach (RedisResult[] fv in (RedisResult[])value)
{
result.Add((string)fv[0], fv);
}
return result;
}
else
{
return default;
}
}
}
}
// .NET port of https://github.com/RedisLabs/JRediSearch/
using System;
using System.Collections.Generic;
using System.Globalization;
using StackExchange.Redis;
......@@ -301,6 +300,28 @@ internal void SerializeRedisArgs(List<object> args)
args.Add(_summarizeSeparator);
}
}
if (_keys != null && _keys.Length > 0)
{
args.Add("INKEYS".Literal());
args.Add(_keys.Length.Boxed());
foreach (var key in _keys)
{
args.Add(key);
}
}
if (_returnFields != null && _returnFields.Length > 0)
{
args.Add("RETURN".Literal());
args.Add(_returnFields.Length.Boxed());
foreach (var returnField in _returnFields)
{
args.Add(returnField);
}
}
}
/// <summary>
......@@ -340,7 +361,7 @@ public Query LimitFields(params string[] fields)
/// <summary>
/// Limit the query to results that are limited to a specific set of keys
/// </summary>
/// <param name="fields">fields a list of TEXT fields in the schemas</param>
/// <param name="keys">a list of the TEXT fields in the schemas</param>
/// <returns>the query object itself</returns>
public Query LimitKeys(params string[] keys)
{
......
// .NET port of https://github.com/RedisLabs/JRediSearch/
using System;
using StackExchange.Redis;
namespace NRediSearch
{
public sealed class Suggestion
{
public string String { get; }
public double Score { get; }
public string Payload { get; }
private Suggestion(SuggestionBuilder builder)
{
String = builder._string;
Score = builder._score;
Payload = builder._payload;
}
public override bool Equals(object obj)
{
if (this == obj)
{
return true;
}
if(!(obj is Suggestion that))
{
return false;
}
return Score == that.Score && String == that.String && Payload == that.Payload;
}
public override int GetHashCode()
{
unchecked
{
int hash = 17;
hash = hash * 31 + String.GetHashCode();
hash = hash * 31 + Score.GetHashCode();
hash = hash * 31 + Payload.GetHashCode();
return hash;
}
}
public override string ToString() =>
$"Suggestion{{string='{String}', score={Score}, payload='{Payload}'}}";
public SuggestionBuilder ToBuilder() => new SuggestionBuilder(this);
public static SuggestionBuilder Builder => new SuggestionBuilder();
public sealed class SuggestionBuilder
{
internal string _string;
internal double _score = 1.0;
internal string _payload;
public SuggestionBuilder() { }
public SuggestionBuilder(Suggestion suggestion)
{
_string = suggestion.String;
_score = suggestion.Score;
_payload = suggestion.Payload;
}
public SuggestionBuilder String(string @string)
{
_string = @string;
return this;
}
public SuggestionBuilder Score(double score)
{
_score = score;
return this;
}
public SuggestionBuilder Payload(string payload)
{
_payload = payload;
return this;
}
public Suggestion Build()
{
bool isStringMissing = _string == null;
bool isScoreOutOfRange = (_score < 0.0 || _score > 1.0);
if (isStringMissing || isScoreOutOfRange)
{
throw new RedisCommandException($"Missing required fields: {(isStringMissing ? "string" : string.Empty)} {(isScoreOutOfRange ? "score not within range" : string.Empty)}");
}
return new Suggestion(this);
}
}
}
}
// .NET port of https://github.com/RedisLabs/JRediSearch/
using System;
namespace NRediSearch
{
public class SuggestionOptions
{
private readonly object WITHPAYLOADS_FLAG = "WITHPAYLOADS".Literal();
private readonly object WITHSCORES_FLAG = "WITHSCORES".Literal();
public SuggestionOptions(SuggestionOptionsBuilder builder)
{
With = builder._with;
Fuzzy = builder._fuzzy;
Max = builder._max;
}
public static SuggestionOptionsBuilder Builder => new SuggestionOptionsBuilder();
public WithOptions With { get; }
public bool Fuzzy { get; }
public int Max { get; } = 5;
public object[] GetFlags()
{
if (HasOption(WithOptions.PayloadsAndScores))
{
return new[] { WITHPAYLOADS_FLAG, WITHSCORES_FLAG };
}
if (HasOption(WithOptions.Payloads))
{
return new[] { WITHPAYLOADS_FLAG };
}
if (HasOption(WithOptions.Scores))
{
return new[] { WITHSCORES_FLAG };
}
return default;
}
public SuggestionOptionsBuilder ToBuilder() => new SuggestionOptionsBuilder(this);
internal bool GetIsPayloadAndScores() => HasOption(WithOptions.PayloadsAndScores);
internal bool GetIsPayload() => HasOption(WithOptions.Payloads);
internal bool GetIsScores() => HasOption(WithOptions.Scores);
[Flags]
public enum WithOptions
{
None = 0,
Payloads = 1,
Scores = 2,
PayloadsAndScores = Payloads | Scores
}
internal bool HasOption(WithOptions option) => (With & option) != 0;
public sealed class SuggestionOptionsBuilder
{
internal WithOptions _with;
internal bool _fuzzy;
internal int _max = 5;
public SuggestionOptionsBuilder() { }
public SuggestionOptionsBuilder(SuggestionOptions options)
{
_with = options.With;
_fuzzy = options.Fuzzy;
_max = options.Max;
}
public SuggestionOptionsBuilder Fuzzy()
{
_fuzzy = true;
return this;
}
public SuggestionOptionsBuilder Max(int max)
{
_max = max;
return this;
}
public SuggestionOptionsBuilder With(WithOptions with)
{
_with = with;
return this;
}
public SuggestionOptions Build()
{
return new SuggestionOptions(this);
}
}
}
}
using System.Threading;
using NRediSearch.Aggregation;
using NRediSearch.Aggregation.Reducers;
using StackExchange.Redis;
using Xunit;
using Xunit.Abstractions;
using static NRediSearch.Client;
namespace NRediSearch.Test.ClientTests
{
public class AggregationBuilderTests : RediSearchTestBase
{
public AggregationBuilderTests(ITestOutputHelper output) : base(output)
{
}
[Fact]
public void TestAggregations()
{
/**
127.0.0.1:6379> FT.CREATE test_index SCHEMA name TEXT SORTABLE count NUMERIC SORTABLE
OK
127.0.0.1:6379> FT.ADD test_index data1 1.0 FIELDS name abc count 10
OK
127.0.0.1:6379> FT.ADD test_index data2 1.0 FIELDS name def count 5
OK
127.0.0.1:6379> FT.ADD test_index data3 1.0 FIELDS name def count 25
*/
Client cl = GetClient();
Schema sc = new Schema();
sc.AddSortableTextField("name", 1.0);
sc.AddSortableNumericField("count");
cl.CreateIndex(sc, new ConfiguredIndexOptions());
cl.AddDocument(new Document("data1").Set("name", "abc").Set("count", 10));
cl.AddDocument(new Document("data2").Set("name", "def").Set("count", 5));
cl.AddDocument(new Document("data3").Set("name", "def").Set("count", 25));
AggregationBuilder r = new AggregationBuilder()
.GroupBy("@name", Reducers.Sum("@count").As("sum"))
.SortBy(10, SortedField.Descending("@sum"));
// actual search
AggregationResult res = cl.Aggregate(r);
Row? r1 = res.GetRow(0);
Assert.NotNull(r1);
Assert.Equal("def", r1.Value.GetString("name"));
Assert.Equal(30, r1.Value.GetInt64("sum"));
Assert.Equal(30.0, r1.Value.GetDouble("sum"));
Assert.Equal(0L, r1.Value.GetInt64("nosuchcol"));
Assert.Equal(0.0, r1.Value.GetDouble("nosuchcol"));
Assert.Null(r1.Value.GetString("nosuchcol"));
Row? r2 = res.GetRow(1);
Assert.NotNull(r2);
Assert.Equal("abc", r2.Value.GetString("name"));
Assert.Equal(10L, r2.Value.GetInt64("sum"));
}
[Fact]
public void TestApplyAndFilterAggregations()
{
/**
127.0.0.1:6379> FT.CREATE test_index SCHEMA name TEXT SORTABLE subj1 NUMERIC SORTABLE subj2 NUMERIC SORTABLE
OK
127.0.0.1:6379> FT.ADD test_index data1 1.0 FIELDS name abc subj1 20 subj2 70
OK
127.0.0.1:6379> FT.ADD test_index data2 1.0 FIELDS name def subj1 60 subj2 40
OK
127.0.0.1:6379> FT.ADD test_index data3 1.0 FIELDS name ghi subj1 50 subj2 80
OK
127.0.0.1:6379> FT.ADD test_index data1 1.0 FIELDS name abc subj1 30 subj2 20
OK
127.0.0.1:6379> FT.ADD test_index data2 1.0 FIELDS name def subj1 65 subj2 45
OK
127.0.0.1:6379> FT.ADD test_index data3 1.0 FIELDS name ghi subj1 70 subj2 70
OK
*/
Client cl = GetClient();
Schema sc = new Schema();
sc.AddSortableTextField("name", 1.0);
sc.AddSortableNumericField("subj1");
sc.AddSortableNumericField("subj2");
cl.CreateIndex(sc, new ConfiguredIndexOptions());
cl.AddDocument(new Document("data1").Set("name", "abc").Set("subj1", 20).Set("subj2", 70));
cl.AddDocument(new Document("data2").Set("name", "def").Set("subj1", 60).Set("subj2", 40));
cl.AddDocument(new Document("data3").Set("name", "ghi").Set("subj1", 50).Set("subj2", 80));
cl.AddDocument(new Document("data4").Set("name", "abc").Set("subj1", 30).Set("subj2", 20));
cl.AddDocument(new Document("data5").Set("name", "def").Set("subj1", 65).Set("subj2", 45));
cl.AddDocument(new Document("data6").Set("name", "ghi").Set("subj1", 70).Set("subj2", 70));
AggregationBuilder r = new AggregationBuilder().Apply("(@subj1+@subj2)/2", "attemptavg")
.GroupBy("@name", Reducers.Avg("@attemptavg").As("avgscore"))
.Filter("@avgscore>=50")
.SortBy(10, SortedField.Ascending("@name"));
// actual search
AggregationResult res = cl.Aggregate(r);
Row? r1 = res.GetRow(0);
Assert.NotNull(r1);
Assert.Equal("def", r1.Value.GetString("name"));
Assert.Equal(52.5, r1.Value.GetDouble("avgscore"));
Row? r2 = res.GetRow(1);
Assert.NotNull(r2);
Assert.Equal("ghi", r2.Value.GetString("name"));
Assert.Equal(67.5, r2.Value.GetDouble("avgscore"));
}
[Fact]
public void TestCursor()
{
/**
127.0.0.1:6379> FT.CREATE test_index SCHEMA name TEXT SORTABLE count NUMERIC SORTABLE
OK
127.0.0.1:6379> FT.ADD test_index data1 1.0 FIELDS name abc count 10
OK
127.0.0.1:6379> FT.ADD test_index data2 1.0 FIELDS name def count 5
OK
127.0.0.1:6379> FT.ADD test_index data3 1.0 FIELDS name def count 25
*/
Client cl = GetClient();
Schema sc = new Schema();
sc.AddSortableTextField("name", 1.0);
sc.AddSortableNumericField("count");
cl.CreateIndex(sc, new ConfiguredIndexOptions());
cl.AddDocument(new Document("data1").Set("name", "abc").Set("count", 10));
cl.AddDocument(new Document("data2").Set("name", "def").Set("count", 5));
cl.AddDocument(new Document("data3").Set("name", "def").Set("count", 25));
AggregationBuilder r = new AggregationBuilder()
.GroupBy("@name", Reducers.Sum("@count").As("sum"))
.SortBy(10, SortedField.Descending("@sum"))
.Cursor(1, 3000);
// actual search
AggregationResult res = cl.Aggregate(r);
Row? row = res.GetRow(0);
Assert.NotNull(row);
Assert.Equal("def", row.Value.GetString("name"));
Assert.Equal(30, row.Value.GetInt64("sum"));
Assert.Equal(30.0, row.Value.GetDouble("sum"));
Assert.Equal(0L, row.Value.GetInt64("nosuchcol"));
Assert.Equal(0.0, row.Value.GetDouble("nosuchcol"));
Assert.Null(row.Value.GetString("nosuchcol"));
res = cl.CursorRead(res.CursorId, 1);
Row? row2 = res.GetRow(0);
Assert.NotNull(row2);
Assert.Equal("abc", row2.Value.GetString("name"));
Assert.Equal(10, row2.Value.GetInt64("sum"));
Assert.True(cl.CursorDelete(res.CursorId));
try
{
cl.CursorRead(res.CursorId, 1);
Assert.True(false);
}
catch (RedisException) { }
AggregationBuilder r2 = new AggregationBuilder()
.GroupBy("@name", Reducers.Sum("@count").As("sum"))
.SortBy(10, SortedField.Descending("@sum"))
.Cursor(1, 1000);
Thread.Sleep(1000);
try
{
cl.CursorRead(res.CursorId, 1);
Assert.True(false);
}
catch (RedisException) { }
}
}
}
using NRediSearch.Aggregation;
using System;
using NRediSearch.Aggregation;
using NRediSearch.Aggregation.Reducers;
using Xunit;
using Xunit.Abstractions;
using static NRediSearch.Client;
namespace NRediSearch.Test.ClientTests
{
......@@ -10,6 +12,7 @@ public class AggregationTest : RediSearchTestBase
public AggregationTest(ITestOutputHelper output) : base(output) { }
[Fact]
[Obsolete]
public void TestAggregations()
{
/**
......@@ -26,7 +29,7 @@ public void TestAggregations()
Schema sc = new Schema();
sc.AddSortableTextField("name", 1.0);
sc.AddSortableNumericField("count");
cl.CreateIndex(sc, Client.IndexOptions.Default);
cl.CreateIndex(sc, new ConfiguredIndexOptions());
cl.AddDocument(new Document("data1").Set("name", "abc").Set("count", 10));
cl.AddDocument(new Document("data2").Set("name", "def").Set("count", 5));
cl.AddDocument(new Document("data3").Set("name", "def").Set("count", 25));
......
using System;
using System.Collections.Generic;
using System.Collections.Generic;
using System.Text;
using StackExchange.Redis;
using Xunit;
using Xunit.Abstractions;
using static NRediSearch.Client;
using static NRediSearch.Schema;
using static NRediSearch.SuggestionOptions;
namespace NRediSearch.Test.ClientTests
{
......@@ -18,7 +20,7 @@ public void Search()
Schema sc = new Schema().AddTextField("title", 1.0).AddTextField("body", 1.0);
Assert.True(cl.CreateIndex(sc, Client.IndexOptions.Default));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var fields = new Dictionary<string, RedisValue>
{
{ "title", "hello world" },
......@@ -58,7 +60,7 @@ public void TestNumericFilter()
Schema sc = new Schema().AddTextField("title", 1.0).AddNumericField("price");
Assert.True(cl.CreateIndex(sc, Client.IndexOptions.Default));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
for (int i = 0; i < 100; i++)
{
......@@ -120,8 +122,7 @@ public void TestStopwords()
Schema sc = new Schema().AddTextField("title", 1.0);
Assert.True(cl.CreateIndex(sc,
Client.IndexOptions.Default.SetStopwords("foo", "bar", "baz")));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions().SetStopwords("foo", "bar", "baz")));
var fields = new Dictionary<string, RedisValue>
{
......@@ -135,8 +136,7 @@ public void TestStopwords()
Reset(cl);
Assert.True(cl.CreateIndex(sc,
Client.IndexOptions.Default | Client.IndexOptions.DisableStopWords));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions().SetNoStopwords()));
fields = new Dictionary<string, RedisValue>
{
{ "title", "hello world foo bar to be or not to be" }
@@ -155,7 +155,7 @@ public void TestGeoFilter()
Schema sc = new Schema().AddTextField("title", 1.0).AddGeoField("loc");
Assert.True(cl.CreateIndex(sc, Client.IndexOptions.Default));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var fields = new Dictionary<string, RedisValue>
{
{ "title", "hello world" },
@@ -188,7 +188,7 @@ public void TestPayloads()
Schema sc = new Schema().AddTextField("title", 1.0);
Assert.True(cl.CreateIndex(sc, Client.IndexOptions.Default));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var fields = new Dictionary<string, RedisValue>
{
@@ -210,7 +210,7 @@ public void TestQueryFlags()
Schema sc = new Schema().AddTextField("title", 1.0);
Assert.True(cl.CreateIndex(sc, Client.IndexOptions.Default));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var fields = new Dictionary<string, RedisValue>();
for (int i = 0; i < 100; i++)
@@ -261,7 +261,7 @@ public void TestSortQueryFlags()
Client cl = GetClient();
Schema sc = new Schema().AddSortableTextField("title", 1.0);
Assert.True(cl.CreateIndex(sc, Client.IndexOptions.Default));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var fields = new Dictionary<string, RedisValue>
{
["title"] = "b title"
@@ -294,7 +294,7 @@ public void TestAddHash()
Client cl = GetClient();
Schema sc = new Schema().AddTextField("title", 1.0);
Assert.True(cl.CreateIndex(sc, Client.IndexOptions.Default));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
RedisKey hashKey = (string)cl.IndexName + ":foo";
Db.KeyDelete(hashKey);
Db.HashSet(hashKey, "title", "hello world");
@@ -313,7 +313,7 @@ public void TestDrop()
Schema sc = new Schema().AddTextField("title", 1.0);
Assert.True(cl.CreateIndex(sc, Client.IndexOptions.Default));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var fields = new Dictionary<string, RedisValue>
{
{ "title", "hello world" }
@@ -335,13 +335,50 @@ public void TestDrop()
Assert.Null(key);
}
[Fact]
public void TestAlterAdd()
{
Client cl = GetClient();
Db.Execute("FLUSHDB"); // YEAH, this is still horrible and I'm still dealing with it.
Schema sc = new Schema().AddTextField("title", 1.0);
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var fields = new Dictionary<string, RedisValue>();
fields.Add("title", "hello world");
for (int i = 0; i < 100; i++)
{
Assert.True(cl.AddDocument($"doc{i}", fields));
}
SearchResult res = cl.Search(new Query("hello world"));
Assert.Equal(100, res.TotalResults);
Assert.True(cl.AlterIndex(new TagField("tags", ","), new TextField("name", 0.5)));
for (int i = 0; i < 100; i++)
{
var fields2 = new Dictionary<string, RedisValue>();
fields2.Add("name", $"name{i}");
fields2.Add("tags", $"tagA,tagB,tag{i}");
Assert.True(cl.UpdateDocument($"doc{i}", fields2, 1.0));
}
SearchResult res2 = cl.Search(new Query("@tags:{tagA}"));
Assert.Equal(100, res2.TotalResults);
var info = cl.GetInfoParsed();
Assert.Equal(cl.IndexName, info.IndexName);
Assert.True(info.Fields.ContainsKey("tags"));
Assert.Equal("TAG", (string)info.Fields["tags"][2]);
}
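A rough sketch (not part of this diff) of the command shape that AlterIndex presumably issues per added field; the index name and field definitions are illustrative.

using StackExchange.Redis;

IDatabase db = ConnectionMultiplexer.Connect("localhost").GetDatabase();

// FT.ALTER adds fields to an existing index without rebuilding it
db.Execute("FT.ALTER", "idx", "SCHEMA", "ADD", "tags", "TAG", "SEPARATOR", ",");
db.Execute("FT.ALTER", "idx", "SCHEMA", "ADD", "name", "TEXT", "WEIGHT", "0.5");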
[Fact]
public void TestNoStem()
{
Client cl = GetClient();
Schema sc = new Schema().AddTextField("stemmed", 1.0).AddField(new Schema.TextField("notStemmed", 1.0, false, true));
Assert.True(cl.CreateIndex(sc, Client.IndexOptions.Default));
Schema sc = new Schema().AddTextField("stemmed", 1.0).AddField(new TextField("notStemmed", 1.0, false, true));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var doc = new Dictionary<string, RedisValue>
{
@@ -359,16 +396,28 @@ public void TestNoStem()
Assert.Equal(0, res.TotalResults);
}
[Fact]
public void TestInfoParsed()
{
Client cl = GetClient();
Schema sc = new Schema().AddTextField("title", 1.0);
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var info = cl.GetInfoParsed();
Assert.Equal(cl.IndexName, info.IndexName);
}
[Fact]
public void TestInfo()
{
Client cl = GetClient();
Schema sc = new Schema().AddTextField("title", 1.0);
Assert.True(cl.CreateIndex(sc, Client.IndexOptions.Default));
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var info = cl.GetInfo();
Assert.Equal((string)cl.IndexName, (string)info["index_name"]);
Assert.Equal(cl.IndexName, info["index_name"]);
}
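GetInfo and GetInfoParsed sit on top of FT.INFO, whose reply is a flat array of alternating property names and values; a sketch of pulling out the index name by hand, with "idx" as an assumed index name.

using StackExchange.Redis;

IDatabase db = ConnectionMultiplexer.Connect("localhost").GetDatabase();

RedisResult[] info = (RedisResult[])db.Execute("FT.INFO", "idx");
string indexName = null;
for (int i = 0; i < info.Length - 1; i += 2)
{
    // the reply alternates between a property name and its value
    if ((string)info[i] == "index_name") indexName = (string)info[i + 1];
}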
[Fact]
@@ -377,9 +426,9 @@ public void TestNoIndex()
Client cl = GetClient();
Schema sc = new Schema()
.AddField(new Schema.TextField("f1", 1.0, true, false, true))
.AddField(new Schema.TextField("f2", 1.0));
cl.CreateIndex(sc, Client.IndexOptions.Default);
.AddField(new TextField("f1", 1.0, true, false, true))
.AddField(new TextField("f2", 1.0));
cl.CreateIndex(sc, new ConfiguredIndexOptions());
var mm = new Dictionary<string, RedisValue>
{
@@ -417,7 +466,7 @@ public void TestReplacePartial()
.AddTextField("f1", 1.0)
.AddTextField("f2", 1.0)
.AddTextField("f3", 1.0);
cl.CreateIndex(sc, Client.IndexOptions.Default);
cl.CreateIndex(sc, new ConfiguredIndexOptions());
var mm = new Dictionary<string, RedisValue>
{
@@ -451,7 +500,7 @@ public void TestExplain()
.AddTextField("f1", 1.0)
.AddTextField("f2", 1.0)
.AddTextField("f3", 1.0);
cl.CreateIndex(sc, Client.IndexOptions.Default);
cl.CreateIndex(sc, new ConfiguredIndexOptions());
var res = cl.Explain(new Query("@f3:f3_val @f2:f2_val @f1:f1_val"));
Assert.NotNull(res);
@@ -464,7 +513,7 @@ public void TestHighlightSummarize()
{
Client cl = GetClient();
Schema sc = new Schema().AddTextField("text", 1.0);
cl.CreateIndex(sc, Client.IndexOptions.Default);
cl.CreateIndex(sc, new ConfiguredIndexOptions());
var doc = new Dictionary<string, RedisValue>
{
@@ -477,6 +526,12 @@ public void TestHighlightSummarize()
Assert.Equal("is often referred as a <b>data</b> structures server. What this means is that Redis provides... What this means is that Redis provides access to mutable <b>data</b> structures via a set of commands, which are sent using a... So different processes can query and modify the same <b>data</b> structures in a shared... ",
res.Documents[0]["text"]);
q = new Query("data").HighlightFields(new Query.HighlightTags("<u>", "</u>")).SummarizeFields();
res = cl.Search(q);
Assert.Equal("is often referred as a <u>data</u> structures server. What this means is that Redis provides... What this means is that Redis provides access to mutable <u>data</u> structures via a set of commands, which are sent using a... So different processes can query and modify the same <u>data</u> structures in a shared... ",
res.Documents[0]["text"]);
}
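The HighlightFields/SummarizeFields calls above presumably translate into HIGHLIGHT and SUMMARIZE clauses on FT.SEARCH; a sketch of the raw form with custom tags (index name and query are illustrative).

using StackExchange.Redis;

IDatabase db = ConnectionMultiplexer.Connect("localhost").GetDatabase();

// SUMMARIZE trims matching fields down to fragments; HIGHLIGHT wraps hits in the given open/close tags
db.Execute("FT.SEARCH", "idx", "data",
    "SUMMARIZE",
    "HIGHLIGHT", "TAGS", "<u>", "</u>");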
[Fact]
@@ -484,7 +539,7 @@ public void TestLanguage()
{
Client cl = GetClient();
Schema sc = new Schema().AddTextField("text", 1.0);
cl.CreateIndex(sc, Client.IndexOptions.Default);
cl.CreateIndex(sc, new ConfiguredIndexOptions());
Document d = new Document("doc1").Set("text", "hello");
AddOptions options = new AddOptions().SetLanguage("spanish");
@@ -494,7 +549,7 @@ public void TestLanguage()
cl.DeleteDocument(d.Id);
var ex = Assert.Throws<RedisServerException>(() => cl.AddDocument(d, options));
Assert.Equal("Unsupported Language", ex.Message);
Assert.Equal("Unsupported language", ex.Message);
}
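For context, SetLanguage presumably becomes a LANGUAGE argument on FT.ADD, and an unrecognised value is what produces the server error asserted above. Sketch only; the index, document id, and field values are illustrative.

using StackExchange.Redis;

IDatabase db = ConnectionMultiplexer.Connect("localhost").GetDatabase();

// LANGUAGE selects the stemmer used when indexing this document
db.Execute("FT.ADD", "idx", "doc1", "1.0", "LANGUAGE", "spanish",
    "FIELDS", "text", "hola");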
[Fact]
@@ -509,7 +564,7 @@ public void TestDropMissing()
public void TestGet()
{
Client cl = GetClient();
cl.CreateIndex(new Schema().AddTextField("txt1", 1.0), Client.IndexOptions.Default);
cl.CreateIndex(new Schema().AddTextField("txt1", 1.0), new ConfiguredIndexOptions());
cl.AddDocument(new Document("doc1").Set("txt1", "Hello World!"), new AddOptions());
Document d = cl.GetDocument("doc1");
Assert.NotNull(d);
@@ -518,5 +573,288 @@ public void TestGet()
// Get something that does not exist. Shouldn't explode
Assert.Null(cl.GetDocument("nonexist"));
}
[Fact]
public void TestMGet()
{
Client cl = GetClient();
Db.Execute("FLUSHDB"); // YEAH, this is still horrible and I'm still dealing with it.
cl.CreateIndex(new Schema().AddTextField("txt1", 1.0), new ConfiguredIndexOptions());
cl.AddDocument(new Document("doc1").Set("txt1", "Hello World!1"), new AddOptions());
cl.AddDocument(new Document("doc2").Set("txt1", "Hello World!2"), new AddOptions());
cl.AddDocument(new Document("doc3").Set("txt1", "Hello World!3"), new AddOptions());
var docs = cl.GetDocuments();
Assert.Empty(docs);
docs = cl.GetDocuments("doc1", "doc3", "doc4");
Assert.Equal(3, docs.Length);
Assert.Equal("Hello World!1", docs[0]["txt1"]);
Assert.Equal("Hello World!3", docs[1]["txt1"]);
Assert.Null(docs[2]);
}
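GetDocuments presumably wraps FT.MGET, which returns one entry per requested key and nil for keys that were never indexed, which is why docs[2] above is null. A sketch with an assumed index name:

using StackExchange.Redis;

IDatabase db = ConnectionMultiplexer.Connect("localhost").GetDatabase();

// one reply element per key; missing documents come back as nil
RedisResult[] docs = (RedisResult[])db.Execute("FT.MGET", "idx", "doc1", "doc3", "doc4");
bool lastIsMissing = docs[2].IsNull;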
[Fact]
public void TestAddSuggestionGetSuggestionFuzzy()
{
Client cl = GetClient();
Suggestion suggestion = Suggestion.Builder.String("TOPIC OF WORDS").Score(1).Build();
// test can add a suggestion string
Assert.True(cl.AddSuggestion(suggestion, true) > 0, $"{suggestion} insert should have returned at least 1");
// test that the partial part of that string will be returned using fuzzy
//Assert.Equal(suggestion.ToString() + " suppose to be returned", suggestion, cl.GetSuggestion(suggestion.String.Substring(0, 3), SuggestionOptions.GetBuilder().Build()).get(0));
Assert.Equal(suggestion.ToString(), cl.GetSuggestions(suggestion.String.Substring(0, 3), SuggestionOptions.Builder.Build())[0].ToString());
}
[Fact]
public void TestAddSuggestionGetSuggestion()
{
Client cl = GetClient();
Suggestion suggestion = Suggestion.Builder.String("ANOTHER_WORD").Score(1).Build();
Suggestion noMatch = Suggestion.Builder.String("_WORD MISSED").Score(1).Build();
Assert.True(cl.AddSuggestion(suggestion, false) > 0, $"{suggestion} should have inserted at least 1");
Assert.True(cl.AddSuggestion(noMatch, false) > 0, $"{noMatch} should have inserted at least 1");
// test that a partial part of that string will have the entire word returned
Assert.Single(cl.GetSuggestions(suggestion.String.Substring(0, 3), SuggestionOptions.Builder.Fuzzy().Build()));
// turn off fuzzy; starting at the second word, no hit
Assert.Empty(cl.GetSuggestions(noMatch.String.Substring(1, 6), SuggestionOptions.Builder.Build()));
// my attempt to trigger the fuzzy by 1 character
Assert.Single(cl.GetSuggestions(noMatch.String.Substring(1, 6), SuggestionOptions.Builder.Fuzzy().Build()));
}
[Fact]
public void TestAddSuggestionGetSuggestionPayloadScores()
{
Client cl = GetClient();
Suggestion suggestion = Suggestion.Builder.String("COUNT_ME TOO").Payload("PAYLOADS ROCK ").Score(0.2).Build();
Assert.True(cl.AddSuggestion(suggestion, false) > 0, $"{suggestion} insert should have returned at least 1");
Assert.True(cl.AddSuggestion(suggestion.ToBuilder().String("COUNT").Payload("My PAYLOAD is better").Build(), false) > 1, "Count single added should return more than 1");
Assert.True(cl.AddSuggestion(suggestion.ToBuilder().String("COUNT_ANOTHER").Score(1).Payload(null).Build(), false) > 1, "Count single added should return more than 1");
Suggestion noScoreOrPayload = Suggestion.Builder.String("COUNT NO PAYLOAD OR COUNT").Build();
Assert.True(cl.AddSuggestion(noScoreOrPayload, true) > 1, "Count single added should return more than 1");
var payloads = cl.GetSuggestions(suggestion.String.Substring(0, 3), SuggestionOptions.Builder.With(WithOptions.PayloadsAndScores).Build());
Assert.Equal(4, payloads.Length);
Assert.True(payloads[2].Payload.Length > 0);
Assert.True(payloads[1].Score < .299);
}
[Fact]
public void TestAddSuggestionGetSuggestionPayload()
{
Client cl = GetClient();
cl.AddSuggestion(Suggestion.Builder.String("COUNT_ME TOO").Payload("PAYLOADS ROCK ").Build(), false);
cl.AddSuggestion(Suggestion.Builder.String("COUNT").Payload("ANOTHER PAYLOAD ").Build(), false);
cl.AddSuggestion(Suggestion.Builder.String("COUNTNO PAYLOAD OR COUNT").Build(), false);
// test that with a partial part of that string will have the entire word returned
var payloads = cl.GetSuggestions("COU", SuggestionOptions.Builder.Max(3).Fuzzy().With(WithOptions.Payloads).Build());
Assert.Equal(3, payloads.Length);
}
[Fact]
public void TestGetSuggestionNoPayloadTwoOnly()
{
Client cl = GetClient();
cl.AddSuggestion(Suggestion.Builder.String("DIFF_WORD").Score(0.4).Payload("PAYLOADS ROCK ").Build(), false);
cl.AddSuggestion(Suggestion.Builder.String("DIFF wording").Score(0.5).Payload("ANOTHER PAYLOAD ").Build(), false);
cl.AddSuggestion(Suggestion.Builder.String("DIFFERENT").Score(0.7).Payload("I am a payload").Build(), false);
var payloads = cl.GetSuggestions("DIF", SuggestionOptions.Builder.Max(2).Build());
Assert.Equal(2, payloads.Length);
var three = cl.GetSuggestions("DIF", SuggestionOptions.Builder.Max(3).Build());
Assert.Equal(3, three.Length);
}
[Fact]
public void TestGetSuggestionsAsStringArray()
{
Client cl = GetClient();
cl.AddSuggestion(Suggestion.Builder.String("DIFF_WORD").Score(0.4).Payload("PAYLOADS ROCK ").Build(), false);
cl.AddSuggestion(Suggestion.Builder.String("DIFF wording").Score(0.5).Payload("ANOTHER PAYLOAD ").Build(), false);
cl.AddSuggestion(Suggestion.Builder.String("DIFFERENT").Score(0.7).Payload("I am a payload").Build(), false);
var payloads = cl.GetSuggestions("DIF", max: 2);
Assert.Equal(2, payloads.Length);
var three = cl.GetSuggestions("DIF", max: 3);
Assert.Equal(3, three.Length);
}
[Fact]
public void TestGetSuggestionWithScore()
{
Client cl = GetClient();
cl.AddSuggestion(Suggestion.Builder.String("DIFF_WORD").Score(0.4).Payload("PAYLOADS ROCK ").Build(), true);
var list = cl.GetSuggestions("DIF", SuggestionOptions.Builder.Max(2).With(WithOptions.Scores).Build());
Assert.True(list[0].Score <= .2);
}
[Fact]
public void TestGetSuggestionAllNoHit()
{
Client cl = GetClient();
cl.AddSuggestion(Suggestion.Builder.String("NO WORD").Score(0.4).Build(), false);
var none = cl.GetSuggestions("DIF", SuggestionOptions.Builder.Max(3).With(WithOptions.Scores).Build());
Assert.Empty(none);
}
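The suggestion tests above are presumably backed by the FT.SUGADD / FT.SUGGET pair; a sketch exercising the same optional flags, with an illustrative dictionary key.

using StackExchange.Redis;

IDatabase db = ConnectionMultiplexer.Connect("localhost").GetDatabase();

// add entries to an autocomplete dictionary, optionally with a payload; INCR bumps an existing score
db.Execute("FT.SUGADD", "sugg:demo", "DIFF_WORD", "0.4", "PAYLOAD", "PAYLOADS ROCK ");
db.Execute("FT.SUGADD", "sugg:demo", "DIFFERENT", "0.7", "INCR");

// fetch matches for a prefix; FUZZY, MAX, WITHSCORES and WITHPAYLOADS mirror SuggestionOptions
db.Execute("FT.SUGGET", "sugg:demo", "DIF",
    "FUZZY", "MAX", "3", "WITHSCORES", "WITHPAYLOADS");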
[Fact]
public void TestGetTagField()
{
Client cl = GetClient();
Schema sc = new Schema()
.AddTextField("title", 1.0)
.AddTagField("category");
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var fields1 = new Dictionary<string, RedisValue>();
fields1.Add("title", "hello world");
fields1.Add("category", "red");
Assert.True(cl.AddDocument("foo", fields1));
var fields2 = new Dictionary<string, RedisValue>();
fields2.Add("title", "hello world");
fields2.Add("category", "blue");
Assert.True(cl.AddDocument("bar", fields2));
var fields3 = new Dictionary<string, RedisValue>();
fields3.Add("title", "hello world");
fields3.Add("category", "green,yellow");
Assert.True(cl.AddDocument("baz", fields3));
var fields4 = new Dictionary<string, RedisValue>();
fields4.Add("title", "hello world");
fields4.Add("category", "orange;purple");
Assert.True(cl.AddDocument("qux", fields4));
Assert.Equal(1, cl.Search(new Query("@category:{red}")).TotalResults);
Assert.Equal(1, cl.Search(new Query("@category:{blue}")).TotalResults);
Assert.Equal(1, cl.Search(new Query("hello @category:{red}")).TotalResults);
Assert.Equal(1, cl.Search(new Query("hello @category:{blue}")).TotalResults);
Assert.Equal(1, cl.Search(new Query("@category:{yellow}")).TotalResults);
Assert.Equal(0, cl.Search(new Query("@category:{purple}")).TotalResults);
Assert.Equal(1, cl.Search(new Query("@category:{orange\\;purple}")).TotalResults);
Assert.Equal(4, cl.Search(new Query("hello")).TotalResults);
}
[Fact]
public void TestGetTagFieldWithNonDefaultSeparator()
{
Client cl = GetClient();
Schema sc = new Schema()
.AddTextField("title", 1.0)
.AddTagField("category", ";");
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var fields1 = new Dictionary<string, RedisValue>();
fields1.Add("title", "hello world");
fields1.Add("category", "red");
Assert.True(cl.AddDocument("foo", fields1));
var fields2 = new Dictionary<string, RedisValue>();
fields2.Add("title", "hello world");
fields2.Add("category", "blue");
Assert.True(cl.AddDocument("bar", fields2));
var fields3 = new Dictionary<string, RedisValue>();
fields3.Add("title", "hello world");
fields3.Add("category", "green;yellow");
Assert.True(cl.AddDocument("baz", fields3));
var fields4 = new Dictionary<string, RedisValue>();
fields4.Add("title", "hello world");
fields4.Add("category", "orange,purple");
Assert.True(cl.AddDocument("qux", fields4));
Assert.Equal(1, cl.Search(new Query("@category:{red}")).TotalResults);
Assert.Equal(1, cl.Search(new Query("@category:{blue}")).TotalResults);
Assert.Equal(1, cl.Search(new Query("hello @category:{red}")).TotalResults);
Assert.Equal(1, cl.Search(new Query("hello @category:{blue}")).TotalResults);
Assert.Equal(1, cl.Search(new Query("hello @category:{yellow}")).TotalResults);
Assert.Equal(0, cl.Search(new Query("@category:{purple}")).TotalResults);
Assert.Equal(1, cl.Search(new Query("@category:{orange\\,purple}")).TotalResults);
Assert.Equal(4, cl.Search(new Query("hello")).TotalResults);
}
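Both tag tests hinge on the TAG separator: with the default comma, "green,yellow" indexes as two tags while "orange;purple" stays a single tag (hence the escaped query), and the second test inverts that by declaring the field with SEPARATOR ";". A sketch of the raw schema and query, with an assumed index name.

using StackExchange.Redis;

IDatabase db = ConnectionMultiplexer.Connect("localhost").GetDatabase();

// a TAG field splits its value on the declared separator at index time
db.Execute("FT.CREATE", "idx", "SCHEMA",
    "title", "TEXT", "WEIGHT", "1.0",
    "category", "TAG", "SEPARATOR", ";");

// punctuation inside a tag query has to be escaped
db.Execute("FT.SEARCH", "idx", "@category:{orange\\,purple}");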
[Fact]
public void TestMultiDocuments()
{
Client cl = GetClient();
Schema sc = new Schema().AddTextField("title", 1.0).AddTextField("body", 1.0);
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var fields = new Dictionary<string, RedisValue>();
fields.Add("title", "hello world");
fields.Add("body", "lorem ipsum");
var results = cl.AddDocuments(new Document("doc1", fields), new Document("doc2", fields), new Document("doc3", fields));
Assert.Equal(new[] { true, true, true }, results);
Assert.Equal(3, cl.Search(new Query("hello world")).TotalResults);
results = cl.AddDocuments(new Document("doc4", fields), new Document("doc2", fields), new Document("doc5", fields));
Assert.Equal(new[] { true, false, true }, results);
results = cl.DeleteDocuments(true, "doc1", "doc2", "doc36");
Assert.Equal(new[] { true, true, false }, results);
}
[Fact]
public void TestReturnFields()
{
Client cl = GetClient();
Db.Execute("FLUSHDB");
Schema sc = new Schema().AddTextField("field1", 1.0).AddTextField("field2", 1.0);
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var doc = new Dictionary<string, RedisValue>();
doc.Add("field1", "value1");
doc.Add("field2", "value2");
// Store it
Assert.True(cl.AddDocument("doc", doc));
// Query
SearchResult res = cl.Search(new Query("*").ReturnFields("field1"));
Assert.Equal(1, res.TotalResults);
Assert.Equal("value1", res.Documents[0]["field1"]);
Assert.Null((string)res.Documents[0]["field2"]);
}
[Fact]
public void TestInKeys()
{
Client cl = GetClient();
Db.Execute("FLUSHDB");
Schema sc = new Schema().AddTextField("field1", 1.0).AddTextField("field2", 1.0);
Assert.True(cl.CreateIndex(sc, new ConfiguredIndexOptions()));
var doc = new Dictionary<string, RedisValue>();
doc.Add("field1", "value");
doc.Add("field2", "not");
// Store it
Assert.True(cl.AddDocument("doc1", doc));
Assert.True(cl.AddDocument("doc2", doc));
// Query
SearchResult res = cl.Search(new Query("value").LimitKeys("doc1"));
Assert.Equal(1, res.TotalResults);
Assert.Equal("doc1", res.Documents[0].Id);
Assert.Equal("value", res.Documents[0]["field1"]);
Assert.Null((string)res.Documents[0]["value"]);
}
}
}
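TestReturnFields and TestInKeys above correspond to the RETURN and INKEYS clauses of FT.SEARCH; a combined sketch with illustrative names.

using StackExchange.Redis;

IDatabase db = ConnectionMultiplexer.Connect("localhost").GetDatabase();

// INKEYS restricts results to the listed document keys;
// RETURN limits which fields come back with each hit
db.Execute("FT.SEARCH", "idx", "value",
    "INKEYS", "1", "doc1",
    "RETURN", "1", "field1");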
@@ -4,6 +4,7 @@
using StackExchange.Redis;
using Xunit;
using Xunit.Abstractions;
using static NRediSearch.Client;
namespace NRediSearch.Test
{
@@ -27,7 +28,7 @@ public void BasicUsage()
bool result = false;
try
{
result = client.CreateIndex(sc, Client.IndexOptions.Default);
result = client.CreateIndex(sc, new ConfiguredIndexOptions());
}
catch (RedisServerException ex)
{
......
using NRediSearch.Aggregation;
using static NRediSearch.QueryBuilder.QueryBuilder;
using static NRediSearch.QueryBuilder.Values;
using static NRediSearch.Aggregation.Reducers.Reducers;
using static NRediSearch.Aggregation.SortedField;
using Xunit;
using Xunit.Abstractions;
using System;
using System.Collections.Generic;
using NRediSearch.Aggregation;
using NRediSearch.QueryBuilder;
using System;
using StackExchange.Redis;
using System.Collections.Generic;
using Xunit;
using Xunit.Abstractions;
using static NRediSearch.Aggregation.Reducers.Reducers;
using static NRediSearch.Aggregation.SortedField;
using static NRediSearch.QueryBuilder.QueryBuilder;
using static NRediSearch.QueryBuilder.Values;
namespace NRediSearch.Test.QueryBuilder
{
@@ -88,7 +88,7 @@ public void TestAggregation()
{
Assert.Equal("*", GetArgsString(new AggregationRequest()));
AggregationRequest r = new AggregationRequest().
GroupBy("@actor", Count().As ("cnt")).
GroupBy("@actor", Count().As("cnt")).
SortBy(Descending("@cnt"));
Assert.Equal("* GROUPBY 1 @actor REDUCE COUNT 0 AS cnt SORTBY 2 @cnt DESC", GetArgsString(r));
@@ -103,5 +103,38 @@ public void TestAggregation()
Assert.Equal("* GROUPBY 1 @brand REDUCE QUANTILE 2 @price 0.5 AS q50 REDUCE QUANTILE 2 @price 0.9 AS q90 REDUCE QUANTILE 2 @price 0.95 AS q95 REDUCE AVG 1 @price REDUCE COUNT 0 AS count LIMIT 0 10 SORTBY 2 @count DESC",
GetArgsString(r));
}
[Fact]
public void TestAggregationBuilder()
{
Assert.Equal("*", new AggregationBuilder().GetArgsString());
AggregationBuilder r1 = new AggregationBuilder()
.GroupBy("@actor", Count().As("cnt"))
.SortBy(Descending("@cnt"));
Assert.Equal("* GROUPBY 1 @actor REDUCE COUNT 0 AS cnt SORTBY 2 @cnt DESC", r1.GetArgsString());
Group group = new Group("@brand")
.Reduce(Quantile("@price", 0.50).As("q50"))
.Reduce(Quantile("@price", 0.90).As("q90"))
.Reduce(Quantile("@price", 0.95).As("q95"))
.Reduce(Avg("@price"))
.Reduce(Count().As("count"));
AggregationBuilder r2 = new AggregationBuilder()
.GroupBy(group)
.Limit(10)
.SortByDescending("@count");
Assert.Equal("* GROUPBY 1 @brand REDUCE QUANTILE 2 @price 0.5 AS q50 REDUCE QUANTILE 2 @price 0.9 AS q90 REDUCE QUANTILE 2 @price 0.95 AS q95 REDUCE AVG 1 @price REDUCE COUNT 0 AS count LIMIT 0 10 SORTBY 2 @count DESC",
r2.GetArgsString());
AggregationBuilder r3 = new AggregationBuilder()
.Load("@count")
.Apply("@count%1000", "thousands")
.SortBy(Descending("@count"))
.Limit(0, 2);
Assert.Equal("* LOAD 1 @count APPLY @count%1000 AS thousands SORTBY 2 @count DESC LIMIT 0 2", r3.GetArgsString());
}
}
}
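The args string asserted above is presumably what gets sent after the index name in FT.AGGREGATE; a sketch of the equivalent raw call for r3, with an assumed index name.

using StackExchange.Redis;

IDatabase db = ConnectionMultiplexer.Connect("localhost").GetDatabase();

// equivalent of "* LOAD 1 @count APPLY @count%1000 AS thousands SORTBY 2 @count DESC LIMIT 0 2"
db.Execute("FT.AGGREGATE", "idx", "*",
    "LOAD", "1", "@count",
    "APPLY", "@count%1000", "AS", "thousands",
    "SORTBY", "2", "@count", "DESC",
    "LIMIT", "0", "2");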
@@ -112,6 +112,16 @@ public void LimitFields()
Assert.Equal(2, query._fields.Length);
}
[Fact]
public void ReturnFields()
{
var query = GetQuery();
Assert.Null(query._returnFields);
Assert.Same(query, query.ReturnFields("foo", "bar"));
Assert.Equal(2, query._returnFields.Length);
}
[Fact]
public void HighlightFields()
{
......