Posted on:
Categories:
Description:
As I mentioned in my previous post, Executing PowerShell in a SharePoint 2013 Timer Job - Part 1, there were issues upgrading some SharePoint 2010 timer jobs to SharePoint 2013. These were due to legacy CAS (Code Access Security) support being enabled for the SharePoint Timer Service. This post describes a workaround I created so that SharePoint 2013 timer jobs can execute PowerShell while leaving legacy CAS support enabled, ensuring that sites and solutions running in SharePoint 2010 mode will still continue to work on the farm.

To do this I decided to break my timer job solution into 3 components:

- SP2013TimerJobWorkaround.SP - a SharePoint solution for deploying the timer jobs
- SP2013TimerJobWorkaround.Console - a console application that executes the business logic for the timer job
- SP2013TimerJobWorkaround.Common - a class library containing the majority of the code, so both the console and SharePoint projects can share it

To demonstrate the issues with legacy CAS support I added two timer jobs to the solution: "SP2013TimerJobWorkaround.SP Failing Timer Job" and "SP2013TimerJobWorkaround.SP Working Timer Job".

Now for some quick notes on the solution's configuration:

The SP2013TimerJobWorkaround.Common project
- Added references to both Microsoft.SharePoint (GAC) and System.Management.Automation (\Program Files (x86)\Reference Assemblies\Microsoft\WindowsPowerShell\3.0\System.Management.Automation.dll)

The SP2013TimerJobWorkaround.Console project
- Added references to both Microsoft.SharePoint (GAC) and SP2013TimerJobWorkaround.Common
- Added a post-build event (more on this in a bit)
- Unchecked 'Prefer 32-bit' for both debug and release configurations (Build tab of the project properties)

The SP2013TimerJobWorkaround.SP project
- Added a reference to SP2013TimerJobWorkaround.Common
- Created a SharePoint mapped folder at ISAPI\SP2013TimerJobWorkaround.SP

To get my console application packaged and deployed with my SharePoint solution I did the following:

1. Configured the project build order to ensure the console application builds before the SharePoint solution: SP2013TimerJobWorkaround.Common, then SP2013TimerJobWorkaround.Console, then SP2013TimerJobWorkaround.SP.
2. Built the solution once so I could link the files properly.
3. Added the console application to the SP2013TimerJobWorkaround.SP project: right-clicked the SP2013TimerJobWorkaround.SP subfolder under the mapped ISAPI folder, chose Add existing items, browsed to '\SP2013TimerJobWorkaround.Console\bin\Debug' and added both the SP2013TimerJobWorkaround.Console.exe and SP2013TimerJobWorkaround.Console.exe.config files to the project.
4. Created a post-build event in the SP2013TimerJobWorkaround.Console project to copy the latest build of the console application over: right-clicked the SP2013TimerJobWorkaround.Console project, selected Properties, selected the Build Events tab and added the following:

```
xcopy "$(TargetDir)*.*" "$(SolutionDir)SP2013TimerJobWorkaround.SP\ISAPI\SP2013TimerJobWorkaround.SP\*.*" /Y
```

Note: this will copy all the files from the bin\Debug or bin\Release folder into the SP2013TimerJobWorkaround.SP\ISAPI\SP2013TimerJobWorkaround.SP folder of the SharePoint project, but only the exe and the config file will be packaged into the WSP.

So now that the project is all set up we can look at some of the code (after one optional packaging check, sketched below).
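As an optional sanity check that is not part of the original walkthrough, you can confirm that the console exe and its config actually made it into the packaged WSP. A WSP is just a CAB file, so expand.exe can list its contents; the WSP filename below is an assumption based on the project name.

```powershell
# List the WSP contents (a WSP is a CAB file) and look for the console application files
expand.exe -D .\SP2013TimerJobWorkaround.SP.wsp | Select-String "SP2013TimerJobWorkaround.Console"
```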
The PowerShell commands that I'm executing are located in the SP2013TimerJobWorkaround.Common class library in the PowerShellHelper class:

```csharp
public static void ExecPowerShellCmdlets()
{
    try
    {
        //define some PowerShell to run (this currently writes out all the site collection sizes)
        const string command = @"foreach($site in Get-SPSite -Limit All) { Echo ""Site $($site.Url) - Storage $($site.Usage.Storage) bytes"" }";

        var runspace = RunspaceFactory.CreateRunspace(InitialSessionState.CreateDefault());
        runspace.Open();

        using (var powerShellCommand = PowerShell.Create())
        {
            powerShellCommand.Runspace = runspace;

            //Add the SharePoint SnapIn
            powerShellCommand.AddScript("Add-PsSnapin Microsoft.SharePoint.PowerShell");
            //add the defined PowerShell
            powerShellCommand.AddScript(command);

            //Write out the results to the ULS logs (if anything comes back)
            foreach (var result in powerShellCommand.Invoke())
            {
                ErrorLogging.WriteUlsEntry(result, TraceSeverity.High, EventSeverity.Information, Definitions.UlsCategory);
            }
        }

        runspace.Close();
    }
    catch (Exception ex)
    {
        ErrorLogging.WriteUlsEntry(ex, TraceSeverity.Unexpected, EventSeverity.ErrorCritical, Definitions.UlsCategory);
    }
}
```

This method will run the PowerShell commands defined and then write the output to the ULS logs for us. If we look now at the console application, all it does is call this method (with a few extra log entries to trace the execution better):

```csharp
static void Main(string[] args)
{
    try
    {
        ErrorLogging.WriteUlsEntry("------ Start Console App ------", TraceSeverity.High, EventSeverity.Information, Definitions.UlsCategory);

        //Run the predefined PowerShell commands
        PowerShellHelper.ExecPowerShellCmdlets();

        ErrorLogging.WriteUlsEntry("------ End Console App ------", TraceSeverity.High, EventSeverity.Information, Definitions.UlsCategory);
    }
    catch (Exception ex)
    {
        ErrorLogging.WriteUlsEntry(ex, TraceSeverity.Unexpected, EventSeverity.ErrorCritical, Definitions.UlsCategory);
    }
}
```

Now for the SharePoint timer jobs. In the SP2013TimerJobWorkaround.SP project there are 2 timer job classes: FailingTimerJob and WorkingTimerJob. The FailingTimerJob runs the following code:

```csharp
public override void Execute(Guid contentDbId)
{
    ErrorLogging.WriteUlsEntry("------ Start Failing Timer Job ------", TraceSeverity.High, EventSeverity.Information, Definitions.UlsCategory);

    try
    {
        //Run the PowerShell cmd
        PowerShellHelper.ExecPowerShellCmdlets();
    }
    catch (Exception ex)
    {
        ErrorLogging.WriteUlsEntry(ex, TraceSeverity.Unexpected, EventSeverity.ErrorCritical, Definitions.UlsCategory);
    }

    ErrorLogging.WriteUlsEntry("------ End Failing Timer Job ------", TraceSeverity.High, EventSeverity.Information, Definitions.UlsCategory);
}
```

It's pretty straightforward, but as you can see it doesn't run due to the legacy CAS support being enabled for the SharePoint Timer Service.
Now for the WorkingTimerJob, it runs this code:

```csharp
public override void Execute(Guid contentDbId)
{
    ErrorLogging.WriteUlsEntry("------ Start Working Timer Job ------", TraceSeverity.High, EventSeverity.Information, Definitions.UlsCategory);

    try
    {
        //Run the console application
        var consoleApp = SPUtility.GetVersionedGenericSetupPath(Definitions.ConsoleApplicationPath, SPUtility.CompatibilityLevel15);

        var process = new Process
        {
            StartInfo = new ProcessStartInfo { FileName = consoleApp }
        };

        process.Start();
        process.WaitForExit();
    }
    catch (Exception ex)
    {
        ErrorLogging.WriteUlsEntry(ex, TraceSeverity.Unexpected, EventSeverity.ErrorCritical, Definitions.UlsCategory);
    }

    ErrorLogging.WriteUlsEntry("------ End Working Timer Job ------", TraceSeverity.High, EventSeverity.Information, Definitions.UlsCategory);
}
```

The difference here is that the working timer job calls the console application, which runs the PowerShell for it, instead of trying to call the PowerShell commands directly. As you can see, it executes as expected and writes out all the site collection sizes to the ULS logs. I've included my sample project here so you can look at the source code in a bit more detail. Hope this helps you out and provides you with another option when running into issues with SharePoint 2013 and legacy CAS support.




Posted on:
Categories: SharePoint;PowerShell
Description:
While upgrading some timer job solutions from SharePoint 2010 to SharePoint 2013 I ran into a frustrating problem. For some reason some of my fairly basic code just wouldn't run. Instead I was getting the following error:

System.TypeInitializationException: The type initializer for 'System.Management.Automation.SessionStateScope' threw an exception. ---> System.InvalidOperationException: Dynamic operations can only be performed in homogenous AppDomain.

This wasn't the most helpful error, but I traced it down to where my timer job was calling a couple of PowerShell cmdlets. After doing some research I was able to isolate the issue to .NET legacy CAS support being enabled. By default in SharePoint 2013, all the web applications as well as the SharePoint Timer Service have .NET legacy CAS (Code Access Security) support enabled. I assume this is required so that SharePoint 2013 can fully support sites and solutions running in SharePoint 2010 mode.

The easiest solution is to go into the configuration file for the SharePoint Timer Service and disable legacy CAS support for the service. Note that you'd have to do this on every server in the farm that is running the Timer Service. To change the settings:

1. Open for editing the configuration file located in the SharePoint hive: C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\BIN\OWSTIMER.EXE.CONFIG
2. Disable the support for legacy CAS by changing the NetFx40_LegacySecurityPolicy element to the following: <NetFx40_LegacySecurityPolicy enabled="false" />
3. Save the file and recycle the SharePoint Timer Service

That should now allow your timer jobs to execute PowerShell without any issues (a scripted version of this change is sketched at the end of this post). Some notes on this though:

- From my research I've seen that disabling legacy CAS support is supported, but not recommended
- By disabling legacy CAS support, you might break your farm's ability to run sites fully in SharePoint 2010 mode, as any SharePoint 2010 solutions deployed to the farm that contain timer jobs will probably no longer run due to CAS issues
- If you are sure that your SharePoint 2013 farm won't need to support sites or solutions in SharePoint 2010 mode, then this should work for you

As for me, breaking SharePoint 2010 mode support wasn't an option, so I had to come up with a better alternative that I'll discuss in Part 2.
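As referenced above, here is a minimal sketch of scripting the change, assuming a default SharePoint 2013 install path, the standard SPTimerV4 service name, and that the NetFx40_LegacySecurityPolicy element already exists in your config; it would need to be run on every server in the farm running the Timer Service.

```powershell
# Sketch: disable legacy CAS for the SharePoint Timer Service and recycle it
$configPath = "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\BIN\OWSTIMER.EXE.CONFIG"

[xml]$config = Get-Content $configPath
# NetFx40_LegacySecurityPolicy sits under <configuration><runtime> in the default config
$config.configuration.runtime.NetFx40_LegacySecurityPolicy.enabled = "false"
$config.Save($configPath)

# Recycle the timer service so the new setting takes effect
Restart-Service -Name SPTimerV4
```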




Posted on:
Categories: PowerShell;SharePoint
Description: How to provision SharePoint structures using REST from PowerShell.
Lately it has become the norm for us to provision SharePoint structures using CSOM-oriented PowerShell scripts. This provides good agility because the provisioning can be run from pretty much any machine against remote farms, both on-premises and in the cloud. There are plenty of PowerShell-CSOM examples posted around the web, but I'm starting to wonder whether we should rather take a REST-oriented approach, and I don't see much being said about that. PowerShell-CSOM, although liberating, is still dependent on the SharePoint Client runtime being installed and kept up to date locally. Just as there is a strong case for app development to shift to jQuery-and-REST, one can also make a case to shift our PowerShell provisioning scripts to REST. For one thing, doing so means our code will be more portable amongst technologies. We might choose, for example, to rip out some provisioning code from a PowerShell-REST script and transpose it into jQuery-REST so that users can self-provision things. We could, of course, transpose into any technology that has the HTTP capabilities. I'm not yet decided on whether relinquishing the CSOM encapsulation layer will prove good or bad, but I do plan to favour REST for the next while and see where it takes us.

Here is a bare-bones pure REST example using PowerShell 3's Invoke-RestMethod. Some examples that use REST do so via SharePoint DLLs, which seems pointless; let's abandon local SharePoint dependencies and go pure HTTP. The sample creates a view in the specified list. It was written for use in an on-premises environment.

```powershell
param (
    [Parameter(Mandatory = $true)][string]$url,
    [Parameter(Mandatory = $true)][string]$listTitle
)

$cred = Get-Credential

# Get form digest (required for POST operations)
$digestUrl = "{0}/_api/contextinfo" -f $url
$response = Invoke-RestMethod -Uri $digestUrl -Method POST -Credential $cred
$formDigestVal = $([XML] $response).GetContextWebInformation.FormDigestValue

# Create the new view
$dataUrl = "{0}/_api/lists/getByTitle('{1}')/Views" -f $url, $listTitle
$headers = @{ "X-RequestDigest" = $formDigestVal }
$contentTypeHeader = "application/json;odata=verbose"
$body = @{ __metadata = @{ type = "SP.View" }; Title = "A test view" }
$jsonBody = ConvertTo-Json $body -Compress

Invoke-RestMethod -Uri $dataUrl -Method Post -Credential $cred -Headers $headers -Body $jsonBody.ToString() -ContentType $contentTypeHeader
```

MSDN has some good documentation on the properties and methods available via REST, including which properties are writeable.
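For completeness, here is a hypothetical invocation of the script above, assuming it has been saved as New-ListView.ps1 (the site URL and list title are placeholders):

```powershell
.\New-ListView.ps1 -url "http://intranet.contoso.com/sites/team" -listTitle "Documents"
```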




Posted on:
Categories: Hyper-V;Storage
Description:
Recently, we've had the opportunity to help several clients migrate their legacy physical or virtual infrastructure environments to converged infrastructure that consolidates the compute (server) tier and the storage tier into a single integrated appliance. Nutanix is a market leader in converged infrastructure and has provided our clients with the ability to significantly reduce their physical infrastructure footprint and scale linearly. Many clients are seeing the value in modernizing their infrastructure and removing reliance on a SAN, as well as seeing significant reductions in energy, rack space and power requirements. One client managed to reduce 22U of rack space into a single 2U appliance!

The Nutanix differentiator resides in its ability to combine standard enterprise-grade hardware with its proprietary tiered storage control technology, allowing for linear scale-out and ground-breaking storage performance, all at a reasonable price point. Each Nutanix node has a combination of solid state flash storage and traditional hard disk drives. The storage controllers pool together all of the available storage and present it as either NFS or SMB 3.0, and the platform is hypervisor agnostic.

How does it perform? One of the key benefits of the Nutanix infrastructure is removing the reliance on an underlying storage network, allowing the flash tier to be placed as close as possible to the compute tier. We were recently migrating a client from hardware that was approximately 5 years old.

From:
- 3 node Hyper-V 2008 R2 cluster
- Mid-tier SAN with 28 disks, capable of roughly 4000 IOPS

To:
- 3 node Hyper-V 2012 R2 Nutanix cluster comprised of 6020 model nodes

Performance was benchmarked inside a Windows Server 2008 R2 test VM before and after migration, and the performance increase was astounding. Tests were performed using SQLIO with a 25 GB test file, 8 threads, 8 outstanding requests and a 120 second duration, as follows (a small wrapper to run all four tests in sequence is sketched at the end of this post):

```
sqlio -kW -t8 -s120 -o8 -frandom -b8 -BH -LS C:\TestFile.dat
sqlio -kR -t8 -s120 -o8 -frandom -b8 -BH -LS C:\TestFile.dat
sqlio -kW -t8 -s120 -o8 -fsequential -b64 -BH -LS C:\TestFile.dat
sqlio -kR -t8 -s120 -o8 -fsequential -b64 -BH -LS C:\TestFile.dat
```

TEST-VM on old hardware:

| Test                    | MBPS   | IOPS    | Avg Latency (ms) |
|-------------------------|--------|---------|------------------|
| Random 8 KB Writes      | 11.16  | 1429.5  | 44               |
| Random 8 KB Reads       | 12.18  | 1559.1  | 40               |
| Sequential 64 KB Writes | 195.67 | 3130.87 | 19               |
| Sequential 64 KB Reads  | 247.45 | 3959.25 | 15               |

TEST-VM on Nutanix:

| Test                    | MBPS    | IOPS     | Avg Latency (ms) |
|-------------------------|---------|----------|------------------|
| Random 8 KB Writes      | 189.52  | 24259.2  | 2                |
| Random 8 KB Reads       | 733.83  | 93930.45 | 0                |
| Sequential 64 KB Writes | 1392.56 | 22281.11 | 2                |
| Sequential 64 KB Reads  | 3792.72 | 60683.59 | 0                |

| Test                    | Performance Increase | Latency Decrease |
|-------------------------|----------------------|------------------|
| Random 8 KB Writes      | 1698%                | 2200%            |
| Random 8 KB Reads       | 6025%                | 4000%            |
| Sequential 64 KB Writes | 712%                 | 950%             |
| Sequential 64 KB Reads  | 1533%                | 1500%            |

While in-guest performance testing is never fully representative of real-world workloads, running identical tests within the same virtual machine on different hardware definitely shows how much progress has been made in 5 years!
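As referenced above, here is a small hypothetical wrapper for running the four SQLIO tests back-to-back and capturing the output, so the before and after runs stay identical. It assumes sqlio.exe is on the path, that the 25 GB C:\TestFile.dat has already been created, and PowerShell 3.0 or later.

```powershell
# Run the four SQLIO tests used above and append the results to a single log file
$tests = @(
    "-kW -t8 -s120 -o8 -frandom -b8 -BH -LS C:\TestFile.dat",
    "-kR -t8 -s120 -o8 -frandom -b8 -BH -LS C:\TestFile.dat",
    "-kW -t8 -s120 -o8 -fsequential -b64 -BH -LS C:\TestFile.dat",
    "-kR -t8 -s120 -o8 -fsequential -b64 -BH -LS C:\TestFile.dat"
)

foreach ($t in $tests) {
    # Split the argument string so each switch is passed as a separate argument
    & sqlio.exe $t.Split(' ') | Tee-Object -FilePath "C:\sqlio-results.txt" -Append
}
```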




Posted on:
Categories: SharePoint
Description:
A customer that I'm currently working with began to have issues with the 'Connect To Office' functionality that is offered in SharePoint after installing Lync 2013 on computers that also have Office 2010.

Symptoms

There were two different symptoms that we began to see. First, some users would click 'Connect to Office' and then 'Add to SharePoint Sites'. SharePoint would indicate that the site had been added, but a link would never appear in the 'SharePoint Sites' explorer shortcut. Second, some users were completely missing the 'SharePoint Sites' shortcut in Windows Explorer.

Solution

The solution for both issues required some edits and additions to the registry. The following registry key is crucial to the proper functioning of the 'Connect to Office' feature and is created as part of a normal Office installation:

HKEY_CURRENT_USER\Software\AppDataLow\Microsoft\Office\15.0\Common\Portal

Notice the reference to '15.0' in the registry key. This indicates a reference to Office 2013, and this key would have been created after Lync 2013 was installed. However, we also have Office 2010 installed along with Lync 2013. Office 2010 is designated by '14.0' (e.g. HKEY_CURRENT_USER\Software\AppDataLow\Microsoft\Office\14.0\Common\Portal). Since there is no '14.0' version of the key listed above, the 'Connect to Office' functionality will not work when you are using Excel 2010, Word 2010, PowerPoint 2010, etc. Installing Lync 2013 seems to have removed (or changed) the version of the registry key that referenced '14.0'.

I was able to fix this by manually creating the required keys (a scripted version of this fix is sketched at the end of this post). Create a new string value called 'PersonalSiteFallbackURL' in the following path:

HKEY_CURRENT_USER\Software\AppDataLow\Microsoft\Office\14.0\Common\Portal

For the data value, add the URL to your My Site host (e.g. http://mysite.contoso.com).

*The LinkPublishingTimestamp key is created automatically when you start adding sites via 'Connect To Office' in SharePoint. I found that I did not have to delete the '15.0' registry hierarchy.

This fix addressed both of the symptoms indicated above. It allowed the 'SharePoint Sites' explorer shortcut to be created for people who were missing it, and also enabled sites that are added via the 'Connect to Office' option to appear in the 'SharePoint Sites' explorer shortcut. Note that you won't see the newly added sites until you open Word, Excel, PowerPoint, etc. and select 'Save As'. By default, sites that are added via 'Connect to Office' will not be added to the 'SharePoint Sites' shortcut for 24 hours. You can bypass this limit by manually restarting the Web Client service on your computer. If the links still aren't populating, you can delete the LinkPublishingTimestamp registry key (located at HKEY_CURRENT_USER\Software\AppDataLow\Microsoft\Office\14.0\Common\Portal), which will trigger a refresh the next time that you open Excel, Word, PowerPoint, etc. and select 'Save As'. Alternatively, you can just wait 24 hours and the links will populate on their own.
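As mentioned above, here is a minimal sketch of creating the missing value with PowerShell. The My Site URL is a placeholder, and because the value lives under HKCU it needs to run in each affected user's context.

```powershell
# Create the Office 2010 (14.0) Portal key and the PersonalSiteFallbackURL value
$portalKey = "HKCU:\Software\AppDataLow\Microsoft\Office\14.0\Common\Portal"

New-Item -Path $portalKey -Force | Out-Null
New-ItemProperty -Path $portalKey -Name "PersonalSiteFallbackURL" `
    -Value "http://mysite.contoso.com" -PropertyType String -Force | Out-Null
```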




Posted on:
Categories: SharePoint;PowerShell
Description:
I was recently tasked with determining the group membership of several users across numerous different site collections. This was another perfect opportunity to use PowerShell. The following script can be scoped at the Web Application, Site Collection or Web level and will give you all group memberships for the specified user in that scope.

```powershell
Function GetGroupMemberships ($User, $Url, $Scope)
{
    Switch ($Scope)
    {
        "WebApplication"
        {
            $WebApp = Get-SPWebApplication $Url
            foreach ($SPSite in $WebApp.Sites)
            {
                Write-Host -ForegroundColor Cyan "Checking" $SPSite.Url
                $userName = Get-SPUser -Identity $User -Web $SPSite.Url -ErrorAction SilentlyContinue
                $groups = @()
                $groupMemberships = $userName.Groups | Select-Object Name
                foreach ($group in $groupMemberships) { $groups += $group.Name }

                foreach ($SPWeb in $SPSite.AllWebs)
                {
                    foreach ($group in $SPWeb.Groups)
                    {
                        foreach ($groupName in $groups)
                        {
                            if ($groupName -eq $group.Name)
                            {
                                Write-Host $User "is a member of " -NoNewline
                                Write-Host -ForegroundColor Green $groupName "" -NoNewline
                                Write-Host "in " -NoNewline
                                Write-Host -ForegroundColor Yellow $SPWeb.Url
                            }
                        }
                    }
                }
            }
        }
        "Site"
        {
            $SPSite = Get-SPSite $Url
            $user = Get-SPUser -Identity $User -Web $Url
            $groups = @()
            $groupMemberships = $user.Groups | Select-Object Name
            foreach ($group in $groupMemberships) { $groups += $group.Name }

            foreach ($SPWeb in $SPSite.AllWebs)
            {
                foreach ($group in $SPWeb.Groups)
                {
                    foreach ($groupName in $groups)
                    {
                        if ($groupName -eq $group.Name)
                        {
                            Write-Host $User "is a member of " -NoNewline
                            Write-Host -ForegroundColor Green $groupName "" -NoNewline
                            Write-Host "in " -NoNewline
                            Write-Host -ForegroundColor Yellow $SPWeb.Url
                        }
                    }
                }
            }
        }
        "Web"
        {
            $SPWeb = Get-SPWeb $Url
            $user = Get-SPUser -Identity $User -Web $Url
            $groups = @()
            $groupMemberships = $user.Groups | Select-Object Name
            foreach ($group in $groupMemberships) { $groups += $group.Name }

            foreach ($group in $SPWeb.Groups)
            {
                foreach ($groupName in $groups)
                {
                    if ($groupName -eq $group.Name)
                    {
                        Write-Host $User "is a member of " -NoNewline
                        Write-Host -ForegroundColor Green $groupName "" -NoNewline
                        Write-Host "in " -NoNewline
                        Write-Host -ForegroundColor Yellow $SPWeb.Url
                    }
                }
            }
        }
    }
}

GetGroupMemberships -User "Domain\UserName" -Url "http://domain.com" -Scope "WebApplication"
#GetGroupMemberships -User "Domain\UserName" -Url "http://domain.com" -Scope "Site"
#GetGroupMemberships -User "Domain\UserName" -Url "http://domain.com" -Scope "Web"
```




Posted on:
Categories: SharePoint
Description:
I recently received a request from one of our customers to have a column that displays dates using a format that isn't supported by the standard 'Date and Time' column type (e.g. August, 23 2013 as opposed to 8/23/2013). This is achievable by using the "Calculated (calculation based on other columns)" column type.

To set this up you need an existing column that uses the standard 'Date and Time' column type. In this example, we will assume that your column is called 'Date'. Now, create a new column that uses the "Calculated (calculation based on other columns)" column type. For this example, the calculated column will be named 'Date Calculated'. In the 'Formula' section enter the following expression:

=TEXT([Date],"MMMM, DD YYYY")

*'Date' refers to the name of your standard Date column.

The output of this formula will be in the form month, day year (e.g. August, 23 2013).
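If you wanted the more common "August 23, 2013" ordering instead, the same approach should work with the comma moved in the format string; this is an untested variation on the formula above rather than something from the original request:

=TEXT([Date],"MMMM DD, YYYY")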




Posted on:
Categories: SharePoint;PowerShell
Description:
After performing a content migration from a file share to SharePoint using a third party utility, we noticed that some folders had a modified date of December 29, 1900. This only affected folders, and there didn't appear to be any pattern determining which folders were affected. I wrote the following script to set the modified date for all affected folders to the current date and time:

```powershell
$date = Get-Date
$web = Get-SPWeb "SPWebURL"

foreach ($list in $web.Lists)
{
    foreach ($folder in $list.Folders)
    {
        if ($folder["Modified"].ToString() -like "*1900*")
        {
            $folder["Modified"] = $date
            $folder.Update()
        }
    }
}
```
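If you would rather see what would be touched before changing anything, a hypothetical read-only pass over the same data could look like this (same "SPWebURL" placeholder as above):

```powershell
# Preview only: list folders whose modified date landed in 1900, without updating them
$web = Get-SPWeb "SPWebURL"

foreach ($list in $web.Lists)
{
    $list.Folders | Where-Object { $_["Modified"].ToString() -like "*1900*" } |
        ForEach-Object { Write-Host "$($list.Title): $($_.Url)" }
}
```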




Posted on:
Categories: Business;SharePoint
Description: It's great to see what others in the community are doing with their SharePoint implementations and to see what's working vs what's not, and in general to get ideas.
It's great to see what others in the community are doing with their SharePoint implementations, to see what's working versus what's not, and in general to get ideas. Although not SharePoint specific, the yearly Nielsen Norman Group 10 Best Intranets report is a great way to see what is trending, not only in terms of intranet functionality, but also in terms of what it takes to build a successful intranet. As always, some interesting trends presented themselves in this year's report. I wanted to start with the trending features of the 10 Best Intranets. These were just some of the features that appeared within these top intranets and seem to be gaining popularity amongst the intranet/site implementers of the world.

Trending Features

- Scoped Search: We sometimes treat search as the answer to all questions. However, I've come across many cases where search was functioning just fine but users still actively complained of not being able to find useful content. This is usually because the search results that are returned are too broad, and the relevant content is diluted amongst the irrelevant. Hence this trend: focus more on scoped searches that are presented in a usable manner. Rather than searching the whole intranet for a term, the user actively selects the part of the site they want to search, resulting in less irrelevant and more relevant content.
- Carousels: Anyone that has asked the broad question of "what should appear on your home page?" knows the political battles that sometimes ensue. The home page is your prime real estate, the waterfront view of your intranet/site. Carousels were an overwhelming trend amongst the 10 Best Intranets, with 8 out of 10 of them using this functionality. This is a great method of displaying more content on the page while presenting it in a digestible manner, with categories and limited scrolling (hopefully).
- Megamenus: This has been a steadily growing trend in the last few years, but if not done correctly it can contribute more to the problem than the solution. Many of this year's sites utilized megamenus along with usability testing to ensure that end users were not further confused or bombarded with choices.
- Persistent Right Content Column: By ensuring the right-most column is always present on inner pages with relevant content, such as related links and contact information, you provide an enhanced navigation experience. The key is to ensure that this content is presented in a simple manner, with important and relevant information.

Trending Statistics

- Faster Development Time: The average number of years it took to develop this year's intranets dropped nearly in half from last year, to 1.4 years. With increased use of Agile development processes, teams developed their sites faster and engaged their end users earlier.
- Growing Team Sizes: Although down from last year, the overall trend is still growing, with the average team size at 16 members. More and more people are finding that in order to implement a successful intranet/site, a lot of different roles must be filled. From the business analysts leading the planning, to the developers and designers creating the site, to the trainers introducing the site to the end users, the team that delivers a successful intranet/site is growing.
- Responsive Design: Unlike last year, almost all of the winners accommodated mobile support in one fashion or another. Responsive design was a focus for 3 out of the 10 winners, who chose to create a single version of the site that would adapt to all platforms.

Although every implementation is different, the desire is the same. We all want to build intranets that are highly adopted and well received. These trending features and statistics, as identified in the 2014 Intranet Design Annual, provide some great insight into what successful teams are doing. The full report can be found here.




Posted on:
Categories: SharePoint
Description: A quick way to determine all usages of a term across your SharePoint environment.
When maintaining a SharePoint term set, you may need to know where a particular term has been used, covering all possible taxonomy fields. This typically happens when someone wishes to deprecate or even delete a term but is unsure of how it is presently being used. Here's a quick way to check farm-wide, using search. Note that, since search results are security-trimmed, this is only definitive if run under an account that has read access to all content, so you may want to ask a suitably-authorised person to run the query.

A No-code Approach

At your organization's main search centre, enter the following in the search box, where the second half is the ID of the term:

owstaxidmetadataalltagsinfo:04b307b57-5917-4e46-b969-ba79dd40f43f

Note the prepended zero after the colon. This will return all objects tagged with that term in any field.

More Control with PowerShell and CSOM

The following script, which assumes SP 2013, does the same thing but means you're not at the mercy of the settings in your search centre. It also serves as a general example of how to execute search queries using CSOM from PowerShell 3+.

```powershell
param(
    [Parameter(Mandatory = $true)][string]$url,
    [Parameter(Mandatory = $true)][guid]$termId
)

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Search")

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($url)
$ctx.Credentials = Get-Credential

$keywordQuery = New-Object Microsoft.SharePoint.Client.Search.Query.KeywordQuery($ctx)
$keywordQuery.QueryText = "owstaxidmetadataalltagsinfo:0{0}" -f $termId
$keywordQuery.SourceId = [guid]"8413cd39-2156-4e00-b54d-11efd9abdb89" # Local SharePoint Results

$searchExecutor = New-Object Microsoft.SharePoint.Client.Search.Query.SearchExecutor($ctx)
$resultTables = $searchExecutor.ExecuteQuery($keywordQuery)
$ctx.ExecuteQuery()

foreach ($resultTable in $resultTables)
{
    foreach ($resultRow in $resultTable.Value.ResultRows)
    {
        Write-Output "$($resultRow.Title) $($resultRow.Path)"
    }
}
```
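A hypothetical invocation of the script above, assuming it has been saved as Find-TermUsage.ps1 (the URL is a placeholder and the GUID is the same example term ID used earlier):

```powershell
.\Find-TermUsage.ps1 -url "http://intranet.contoso.com" -termId "4b307b57-5917-4e46-b969-ba79dd40f43f"
```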




Posted on:
Categories: SharePoint
Description:
I was recently tasked with addressing a series of errors that had been occurring very frequently on one of our SharePoint servers:

Description: SQL Database login for 'SPFarm_Config' on Instance 'SP2010SQL' failed. Additional error information from SQL Server is included below. Login failed for user 'Domain\ServerName'
Event ID: 3351

By using SQL Profiler and the timestamp from the event log item, we were able to identify the process ID (PID) of the process that was generating the error. Now that I had the PID, I was able to open Process Monitor on the affected server and identify the offending process (a quicker alternative is sketched at the end of this post). It turned out that the offending process was PowerShell.exe. Drilling into the event properties, I was able to see the exact command that was being run:

"C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe" -ExecutionPolicy RemoteSigned -Command "&.\FASTContentPluginDiscovery.ps1 'FBBF9CEA-9F7D-BFA0-613A-F7940C05FC3C' '26B375A8-C8CF-EEB8-780B-0651251A2FA7' 'serverName' 'Content Plugin (serverName)'"

Running an online search for FASTContentPluginDiscovery.ps1 allowed me to identify SCOM (System Center Operations Manager) as the application that was invoking PowerShell and executing the aforementioned command. I contacted the SCOM team and alerted them about the errors that were being generated by the PowerShell command. The SCOM team was able to resolve the issue by removing the affected server from the SCOM class that the FAST Content Plugin discovery was targeted to. The server under consideration does not run FAST, so it made sense to remove it from the SCOM class.
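As noted above, once SQL Profiler hands you the client PID, a quick hypothetical way to map it to a process name and command line without opening Process Monitor is a one-liner like the following (replace 1234 with the actual PID):

```powershell
# Map a PID to its process name and full command line (WMI also works on older servers)
Get-WmiObject -Class Win32_Process -Filter "ProcessId = 1234" |
    Select-Object ProcessId, Name, CommandLine
```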