DNN Community Blog

The Community Blog contains the personal opinions of community members and is by no means the official standpoint of DNN Corp or DNN Platform. This is a place to express personal thoughts about DNN Platform, the community and its ecosystem. Do you have useful information that you would like to share with the DNN Community in a featured article or blog? If so, please contact .

The use of the Community Blog is covered by our Community Blog Guidelines - please read before commenting or posting.

Using Cloud Storage for Dnn Images, Users and other Folders

Separation of application and storage is always a good thing.  When we build a website (or any software, for that matter) it’s best to separate the storage of data used by the site from the application code used to deliver the site itself.  This gives you the confidence to upgrade application code without risking data.  In the case of websites, it can also give you better performance, because browsers are able to open multiple connections to multiple domains at once.  Both Microsoft Windows and Apple OS X have different locations for storage (Documents, Images, etc.) than for applications (Program Files, Applications).   This makes it easy to back up just your data, and even to move data between installations.   So far, so good.

Since the very beginning, Dnn has been built with the assumption of a single storage space for the application and the files belonging to that application.  Originally, this was a good idea: not many people would have adopted the platform had it required a separate server to deliver images and documents alongside the web server itself.   Not many people had two servers lying around, or even a SAN to deliver files.   The rise of public cloud services has changed this scenario, and now it is trivially easy – and cheap – to store your application files separately from your application code.

Enter the Separation of Storage and Application

A change was introduced in Dnn Platform 7.3: the ability to create a new install/site and have the contents of the /Portals/ path created on separate file storage.   This is achieved through the use of Folder Providers, which were originally introduced in Dnn 6.  The change is pretty simple: when the install is run, it looks for a directive to create the portal folders either within the application (the default scenario) or using another Folder Provider.   Note that I have talked about cloud providers here, but it’s not restricted to that – you can use any Folder Provider, and you can build your own.  That includes UNC-based folder providers and anything else you can come up with.

Once the install is finished, you’ll see that the installed folders are created with the specified Folder Provider (usually represented with a different icon).

This means that any data uploaded to the Users or Images folder in this site will be uploaded directly to the Azure Storage account set up for this site, including content uploaded by any new users who create a profile.  Of course, you can still create different types of provider-based folders once the site is created.  This change simply allows folders in the install template to be created in your provider of choice at installation time.


How to Create a new Dnn Installation using an Azure Storage Account

Note: this example uses Evoq Content 8.3 as the install package, but the same process works for Dnn Platform in all versions since 7.3.  However, you do have to have a Folder Provider installed.  Evoq Content comes with both Azure and Amazon S3 folder providers built in; Dnn Platform doesn’t come with any standard Folder Providers. This example shows the use of an Azure storage account and the Azure Folder Provider. To follow this example with Dnn Platform, you’d need to purchase or build your own Folder Provider and include it in the list of modules installed by default.  You can find different examples in the Dnn Store, and there are also open source examples available.

Step 1 : Create a new storage account in Azure through the Azure Portal, using the + New button.   Provide a name for the storage account and choose the Azure region and subscription.  In this example I have chosen Geo-Redundant storage, which ensures that the data is replicated to a secondary geographic region.

Step 2 : Prepare the new installation by unzipping the install package, and locate the DotNetNuke.install.config.resources file.  Open the file in a text editor.

Add in the following section, after the last </portals> closing tag:

		<folderType name="AzureFolderType">
			<businessClassQualifiedName>DotNetNuke.Professional.FolderProviders.AzureFolderProvider, DotNetNuke.Professional.FolderProviders, Version=, Culture=neutral, PublicKeyToken=null</businessClassQualifiedName>
			<settings>
				<setting encrypt="true" name="AccountName">{accountname}</setting>
				<setting encrypt="true" name="AccountKey">{accountkey}</setting>
				<setting name="Container">portals</setting>
				<setting name="UseHttps">True</setting>
				<setting name="DirectLink">True</setting>
				<setting name="DefaultMappedPath">{PortalId}/</setting>
			</settings>
		</folderType>

Note that this example uses the DotNetNuke.Professional.FolderProviders.AzureFolderProvider business class definition.  If you are not using Evoq Content, you’ll need to substitute the value for the Folder Provider you’re intending to use.  This should be easy to find by looking at the web.config file of an install with the same Provider installed, or by asking the developer who made the provider.

The folderMappings section specifies which folders in the install template are going to be created on the specified Folder Provider.  In this case, you can see that Images, Documents and Users are to be created in Azure.  You can modify this as needed.
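Since the snippet above doesn’t show it, here is a sketch of what such a folderMappings section typically looks like (the attribute names are assumed from Dnn’s folder mapping configuration, so verify them against your own install file):

```xml
<folderMappings>
	<folderMapping folderPath="Images" folderTypeName="AzureFolderType" />
	<folderMapping folderPath="Documents" folderTypeName="AzureFolderType" />
	<folderMapping folderPath="Users" folderTypeName="AzureFolderType" />
</folderMappings>
```

Each folderMapping entry ties one template folder to the folderType defined earlier, which is what directs the installer to create that folder on the external storage.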

Step 3 : Add in storage account credentials

Open the storage account in the Azure portal, copy one of the storage account keys, and paste it into the DotNetNuke.install.config.resources file where I have added {accountkey} in the above example. It doesn’t matter which key you use.  I tend to use Key1 for ‘fixed’ values like this, and reserve Key2 for giving out to others if needed, because you can then easily regenerate that key to revoke access.   You must also substitute the account name for the {accountname} value in the xml.
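After substitution, those two settings look something like this (the account name below is a made-up placeholder, not a real account, and the key placeholder stands in for the base64 key string copied from the portal):

```xml
<setting encrypt="true" name="AccountName">mydnnstorage</setting>
<setting encrypt="true" name="AccountKey">PASTE-KEY1-FROM-THE-AZURE-PORTAL-HERE</setting>
```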

Step 4 : Create the Container in the Azure Storage Account

A Storage account needs a Container before you can save any files in it.   As per the ‘Container’ setting in the above example, I have specified a container called ‘portals’.  This means there needs to be a container called ‘portals’ in the Azure Storage account (container names must be lowercase).

You can create a Container through the Azure portal (click on ‘Blobs’ in the Storage Account), but I have used Azure Storage Explorer (a free tool) to create the container:

This container is created as a public access container, which is suitable for the public-access images I intend to store in it.

Step 5 : Run the Installer

For this step, run the Dnn install as you would any normal Dnn installation.   As the install completes, you will see the files from the Site Template you use fill up the container in the storage account:

When the install is complete, log onto the File Manager and you’ll see that the folders are created using the Azure Folder Provider.

Looking at the Install Files

If you take a look at the Dnn install, you’ll notice a couple of things which are slightly different from a standard Dnn install.

1.  There will be a copy of the portals/0 folder in the application folders as well as in the Azure container.

This is an artefact of the installer.  Because the content in the site template is usually hard-coded to a local path, the files are copied there.   You can ignore these files and delete them after you delete the default content.  From then on, all your files will upload to the Azure storage container.

2. Your file system has portals/0-system

In fact, this is present in all installs from 7.3 onwards, whether you use Folder Providers or not.  This is actually used to separate out the Cache folder (which needs to be local for performance reasons) from cloud storage folders.

3. You have a new file called ‘DotNetNuke.folderMappings.config’ in the root of the application install.

This is a copy of the folderMappings section from the DotNetNuke.install.config.resources file.  Because the install file is ignored after the installation is complete, future sites added to the Dnn installation wouldn’t otherwise know which storage account to use.  If you add a new Site to this install, the Folder Provider configuration is read from this file during the new site creation process.   This also means you can modify the file between site creations to change the Folder Provider details used for new sites.
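As an illustration, the DotNetNuke.folderMappings.config file might look something like the following sketch (the folderMappingsSettings root element and the attribute names are assumed here; verify them against the file generated by your own install):

```xml
<folderMappingsSettings>
	<folderTypes>
		<folderType name="AzureFolderType">
			<!-- businessClassQualifiedName and settings as in the install config shown earlier -->
		</folderType>
	</folderTypes>
	<folderMappings>
		<folderMapping folderPath="Images" folderTypeName="AzureFolderType" />
		<folderMapping folderPath="Documents" folderTypeName="AzureFolderType" />
		<folderMapping folderPath="Users" folderTypeName="AzureFolderType" />
	</folderMappings>
</folderMappingsSettings>
```

Editing this file is how you would point subsequently created sites at a different storage account or container.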

Using with Dnn

From this point onwards, using Dnn with the Folder Provider is no different from normal.   As an example, I used the Evoq Content Publisher functionality to publish an article.  I selected the image for the article and uploaded it to the Images path.  This uploaded the file to Azure, and when the image is inserted into the article, it uses the URL generated by the Azure Folder Provider.

Using a CDN to deliver content

Azure Storage accounts can easily be used to deliver the stored content over an Azure Content Delivery Network (CDN).  This pushes the data closer to the site visitor, so that they receive a faster response, and it can lower bandwidth costs in some cases.

To create a CDN in Azure, add a new CDN using the +New button, and give it a name and the home Azure Region.

Then associate a CDN endpoint with the Storage Account.

When creating the endpoint, associate it with the storage account, then set the origin path to the /portals container.  

From then on, the files will be accessible in the format http(s)://<yourcdnname>.azureedge.net/0/images/your-file.jpg, where your-file.jpg is a file you uploaded to the Images folder in Dnn.  Note that the /portals path is not in the CDN path.

At this point, the Azure Folder Provider doesn’t automatically pick up the CDN URL, but you can easily insert images by URL, as shown in this example where I create a new article in the Evoq Publisher, and insert the image by URL.  Note the CDN endpoint URL serving the same image already uploaded via Dnn.

Returning the URL is the responsibility of the Folder Provider, so any Provider on any storage platform can be made to return the CDN URL.


The use of Folder Providers at the installation level is very useful when you want to separate the delivery of content from the application itself.  This is particularly important when it comes to storing extremely large amounts of content – storage on web servers is exceedingly expensive compared to cloud storage, and on some cloud platforms the total amount of storage you get with the application is capped at a hard limit.  It also allows for separate backup strategies for content versus application data.   Finally, effectively unlimited cloud storage means that you’ll never run out of space, no matter how many funny cat videos your site users upload.   You can also take advantage of built-in CDN functionality in some cloud storage providers.

The drawback is that this can only be achieved with new installations.    It also takes some knowledge of the install process and of configuring Folder Providers, so I don’t recommend it for first-time installers.  However, if you regularly create Dnn installations, then I suggest you give it a try.


Chris Csanyi
Bruce, Great article and something I am wanting to look at in more detail. Just a quick question can you get something like this going with an existing folder structure with thousands of documents with some type of a redirect setup in the webconfig? Just not sure what is possible and of course need to start digging in to see what the options are.
Chris Csanyi Thursday, February 25, 2016 2:19 PM (link)
Mitchel Sellers
Thank you so much for sharing this one!
Mitchel Sellers Thursday, February 25, 2016 4:34 PM (link)
Will Strohl
Who's going to #participate and do a pull request to put this into the documentation center? There's community points involved if you do...
Will Strohl Thursday, February 25, 2016 7:02 PM (link)
Daniel Mettler
Thanks bruce.

I have a question - because I remember trying this functionality in early days and dismissing it - so I hope I was wrong: When a file is on an external storage, how does it get delivered to the browser-client?
1. Does the in-html-link already point to the CDN
2. does the in-html link point to the linkclick.aspx, which then redirects to the CDN
3. Does the in-page html link to the "fake" dnn-location, which then redirects to the CDN
4. Does the in-page html link to the fake dnn-location, which then downloads and streams the file to the user?

IMHO the correct implementation should be #1, but I remember it being one of the others. From what I remember, the link-resolution to a file#235 would return the link-click address or something. What's the status now? Any live website I could browse to which uses this kind of setup so I could experience it as a user?

Daniel Mettler Friday, February 26, 2016 3:33 AM (link)
Bruce Chapman
@Daniel - to be clear, the only thing new I am introducing here is the ability to create these for the 'built in' (templated) folders at installation time. Of course Folder Providers have been around for a long time (http://www.dnnsoftware.com/wiki/folder-providers)

Now, to answer your question - it depends. It depends entirely on the Folder Provider installation and configuration.

The example I showed here using the Azure Folder Provider with a public container and public folder in DNN generates the URL direct to the file in the Azure Storage Container. You can see that in first picture on the page with the Kangaroo in it - the red arrow points to the URL used for the image. You can see that URL is direct to the Azure blob URL. So that is your scenario #1 (except it's not a CDN, it's just the storage container endpoint, which isn't a real CDN).

However, if you create a container (using the Folder providers) which is a private container, and then create folder so that it is restricted access for users in the specific role, then it will use a LinkClick style URL (again, it depends on the Folder Provider implementation). In that case, the link click handler will open the remote resource and stream it down to the user, using a URL which is obviously on the same domain as the website. In this case you're opening the remote resource and also using your own web server. So this will be slower, but it is secure in the same way locating the file in a 'secure' folder is secure.

If you're authoring a Folder Provider, then you can use whatever way you want - you've just got to provide an implementation of the Folder Provider that matches what DNN expects. As I say in the article, this could include using a CDN pattern through configuration if you wanted.
Bruce Chapman Friday, February 26, 2016 5:31 AM (link)


Copyright 2017 by DNN Corp