Hello.
I'm running DNN 9.2.1 in an Azure Web App setup with 2 instances. I recently started using DNN search to index my website's content, and my logs have exploded with errors (shown at the bottom). In my usual configuration, on VMs, I can set the crawler task to run on one server only. With Web Apps, however, that's not possible because the server name is dynamic and can change at any time.
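For reference, this is roughly how I pin the crawler on the VM setup, done directly against the Schedule table rather than through Host > Schedule. The table and column names are the standard DNN schema, but the exact TypeFullName match and the comma-delimited Servers format are from memory, so treat this as a sketch and double-check before running it:

    -- Restrict the "Search: Site Crawler" task to a single named server.
    -- The scheduler skips tasks on servers not listed in the Servers column.
    UPDATE dbo.Schedule
    SET    Servers = ',WEBSERVER01,'  -- hypothetical server name; DNN stores a comma-delimited list
    WHERE  TypeFullName LIKE 'DotNetNuke.Services.Search.SearchEngineScheduler%';

With Web Apps the instance names change, so this approach doesn't carry over, which is exactly my problem.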
Does anyone have tips for avoiding gigantic logs (50 MB+) filled with errors related to this? Do I have to resort to indexing only once per day?
Thanks!
ERROR:
DotNetNuke.Services.Exceptions.SearchException: Unable to create Lucene writer (lock file is in use). Please recycle AppPool in IIS to release lock. ---> System.IO.IOException: The process cannot access the file 'D:\home\site\wwwroot\App_Data\Search\write.lock' because it is being used by another process.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.File.InternalDelete(String path, Boolean checkHost)
at System.IO.File.Delete(String path)
at DotNetNuke.Services.Search.Internals.LuceneControllerImpl.get_Writer()
--- End of inner exception stack trace ---
at DotNetNuke.Services.Search.Internals.LuceneControllerImpl.get_Writer()
at DotNetNuke.Services.Search.Internals.LuceneControllerImpl.Delete(Query query)
at DotNetNuke.Services.Search.Internals.InternalSearchControllerImpl.DeleteSearchDocumentInternal(SearchDocument searchDocument, Boolean autoCommit)
at DotNetNuke.Services.Search.Internals.InternalSearchControllerImpl.AddSearchDocumentInternal(SearchDocument searchDocument, Boolean autoCommit)
at DotNetNuke.Services.Search.Internals.InternalSearchControllerImpl.AddSearchDocuments(IEnumerable`1 searchDocuments)