Posted on:
Categories: Business;Azure
Azure Log Analytics, part of the Operations Management Suite, is Microsoft's SaaS-based solution for delivering IT insights. When I'm demonstrating the solution to clients, I like to tell them it's a "Business Intelligence dashboard for their servers," which resonates well. The essence of the solution is that you install an agent onto your VMs (which can live on-premises, in Azure, or in AWS). The agent feeds all of your logs into the Azure Log Analytics service, and Microsoft leverages its big data and machine learning platforms, combined with intelligence packs, to provide actionable recommendations. There are many intelligence packs for different insights, including Active Directory, Change Tracking, System Update, Security & Audit, Malware Assessment, Network Monitoring, SQL Assessment, and many others.

The best way to show the power of the automated SQL Assessment is to see it in action. Immediately you can see all of the recommendations Microsoft is making for your SQL environment, taking the log data and automatically comparing it against over 14,000 knowledge base articles. If you drill into any of these recommendations, Microsoft provides step-by-step instructions on how to resolve the issue, which is fantastic for organizations without full-time DBAs or in-depth SQL knowledge. This data is constantly updated, so you always have the most recent assessment at your fingertips.

So how much does this all cost? As with most things in IT, the answer is… it depends. Microsoft charges $2.30 per GB of data processed under the standard tier, which retains data for a month, and $3.50 per GB of data processed for the tier that retains data for a year. There's also a FREE tier, which retains data for only 7 days and gives you an idea of how much total data is being ingested, so you can model future costs. All of the configuration can be done within a single day, leading to a very rapid return on value!
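To make the pricing model concrete, here is a rough back-of-the-envelope calculation in PowerShell. The daily ingestion volume is a made-up figure for illustration; your own free-tier numbers would replace it.

$gbPerDay = 5          # hypothetical ingestion volume, e.g. from the free tier
$daysPerMonth = 30

# Standard tier: $2.30 per GB processed, one month of retention
$standardMonthly = $gbPerDay * $daysPerMonth * 2.30

# Longer-retention tier: $3.50 per GB processed, one year of retention
$yearlyRetentionMonthly = $gbPerDay * $daysPerMonth * 3.50

"Standard tier:       `$$standardMonthly per month"
"One-year retention:  `$$yearlyRetentionMonthly per month"

At 5 GB/day, that works out to $345/month on the standard tier versus $525/month for year-long retention.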
If you have any questions or would like to see a live demo, I'd encourage you to drop me a line. Cheers, Eric Fontaine | Senior Cloud Solution Architect

Posted on:
Categories: SharePoint;PowerShell
Description: We encountered an issue in which certain document types (i.e., .doc, .xls, .msg) uploaded to a document library would be assigned the Folder content type instead of the custom content type added to the document library.
Scenario
We created a PowerShell script to provision a site with custom columns, content types, lists, and libraries. The Document content type was removed from the document libraries and a custom content type was added in its place. When testing the newly provisioned site, we found that certain document types (i.e., .doc, .xls, .msg) uploaded to these document libraries were missing the expected metadata fields. Further investigation revealed that these documents were being assigned the Folder content type instead of the custom content type we had added to the document library.

The Cause
In our PowerShell script, we were deleting the document library's default Document content type before adding the new content type. Performing the operations in that order causes some document file types to be assigned the Folder content type. This behaviour applies to both SharePoint 2010 and SharePoint 2013.

Solution
In our case, we updated our site provisioning script to add the new content type first, before deleting the Document content type from the document library. If your document libraries are already in use, Steven Van de Craen has outlined two solutions in his blog:

Solution 1: Re-add the Document content type to the library and delete it again.
Solution 2: Set the default content type by updating the list's content type order.

Here is a PowerShell script to set the default content type of a list:

$web = Get-SPWeb "http://siteURL"
$list = $web.Lists["MyDocumentLibrary"]
$contentTypes = $list.ContentTypes

# Build an ordered list containing only the desired content type
$result = New-Object System.Collections.Generic.List[Microsoft.SharePoint.SPContentType]
foreach ($ct in $contentTypes)
{
    if ($ct.Name -eq "MyContentType")
    {
        $result.Add($ct)
    }
}

# The first content type in UniqueContentTypeOrder becomes the default
$list.RootFolder.UniqueContentTypeOrder = $result
$list.RootFolder.Update()

To avoid this issue in the future, just remember to add new content types before deleting the default Document content type from a document library. Cheers!

References
Steven Van de Craen's blog
Dot Net Spark code snippet
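The corrected provisioning order described above can be sketched in PowerShell as follows. The site URL, library name, and content type name are placeholders, and this assumes the custom content type has already been published to the site's available content types.

# Assumes the SharePoint snap-in is loaded:
# Add-PSSnapin Microsoft.SharePoint.PowerShell
$web = Get-SPWeb "http://siteURL"
$list = $web.Lists["MyDocumentLibrary"]

# 1. Add the custom content type FIRST...
$customCT = $web.AvailableContentTypes["MyContentType"]
$list.ContentTypes.Add($customCT) | Out-Null

# 2. ...THEN delete the default Document content type
$documentCT = $list.ContentTypes["Document"]
$list.ContentTypes.Delete($documentCT.Id)
$list.Update()

Reversing steps 1 and 2 reproduces the issue: with no document content type left on the library, some uploaded file types fall back to the Folder content type.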