Posted on:
Categories: SharePoint
Description:
By default, SharePoint is configured to remove the workflow history from an item 60 days after the completion date of a given workflow. This might not be an issue under normal circumstances; however, we started to receive reports from some business groups indicating that they needed the workflow history for legal auditing purposes. I was pleased to discover that SharePoint doesn't actually delete the workflow history for a given item; it just unlinks the item from the workflow history. At this point I began searching for options that might enable me to reconnect each item to its respective workflow history. The solution came in the form of a calculated column. The strategy would be to create a new column that would contain a link to the relevant workflow history for each item. The workflow history list provided me with everything that I needed to uniquely identify each item and its workflow history. Here is the formula that I ended up using to create a link back to each item's workflow history:
=CONCATENATE("<a href='https://site.contoso.com/sites/sitename/Lists/Workflow History/Audit View.aspx?View=7B4903D7C0%2DE871%2D462A%2D9D60%2D89A7E8B68901&FilterField1=Item&FilterValue1="&ID&"&FilterField2=List&FilterValue2=3a7dd990-fc6c-459b-a341-ac0bc7ab46a8&FilterField3=WorkflowAssociation&FilterValue3=0ca99fcb-3e3b-4ce4-ae21-44e627959fee'>Internal Approval Workflow History</a>")
This query string filters on the item ID, the List ID and the Workflow Association ID, which results in a clean display of the workflow history for each item. I created a new view in the workflow history list called 'Audit View', which contains the relevant fields that would be useful for auditing purposes. The formula references the List ID as well as the Workflow Association ID; in this case we had several workflows associated with the list, so it was important to distinguish them. The end result is a column with hyperlinks to the workflow history for each item.
*When you create the 'Audit History' calculated column using a 'Date and Time' data type, the link text will be justified to the left side of the column.




Posted on:
Categories: Business
Description: A good disaster recovery program should be like a good insurance policy: you put it in place, hope you never have to use it, and then enjoy peace of mind until you perform a periodic check to make sure that your policy still aligns with your business needs. The good news is that with recent advances in technology and the support of skilled providers, disaster recovery really can be this painless.
A good disaster recovery program should be like a good insurance policy: you put it in place, hope you never have to use it, and then enjoy peace of mind until you perform a periodic check to make sure that your policy still aligns with your business needs. The good news is that with recent advances in technology and the support of skilled providers, disaster recovery really can be this painless. Softlanding has been very busy over the last several months introducing clients to, and implementing, Microsoft Azure Site Recovery, which now has full support for any combination of Hyper-V, VMware and physical servers with both Microsoft and Linux operating systems. The simplicity of the Azure Site Recovery solution has really resonated with our clients and the adoption rate is skyrocketing! Many clients ask how the costs compare with building a traditional secondary datacenter; Microsoft and IDC have published cost comparisons based on 100 VMs. A great way to look at leveraging the cloud for DR is again to compare it to car insurance: you don't buy a second car and keep it in your garage in case you crash the first one, you buy insurance! The great news is that once your disaster recovery solution is in place and tested, your organization now has a clear migration roadmap to the cloud for running Infrastructure as a Service (IaaS) virtual machines: you can simply leave your tested VMs running up in Azure! If you are interested in seeing a live demo of restoring a failed VM into Azure in 10 minutes, drop me a line at efontaine@softlanding.ca.




Posted on:
Categories: SharePoint
Description:
I am working on a project that requires me to pass the start and end date of a SharePoint calendar event to an external web service. This integration takes place in a remote event receiver and, for data integrity reasons, it happens in the before events, namely ItemAdding, ItemUpdating and ItemDeleting. In before events, item field values are available in the AfterProperties collection. So to get the event start date, we would use AfterProperties["EventDate"], and for the end date, we would use AfterProperties["EndDate"]. Here comes the first challenge. If you call ToString on either element, you get something like "2015-08-31T17:00:00Z". Don't be fooled by the format (it ends with a Z): the time will always be equal to the time the user selects on the new or edit event page, but the time zone is not UTC. Before we do anything, we need to convert the field value to a DateTime object. We cannot use DateTime.Parse because that method would treat the time as UTC to begin with. We use DateTimeOffset.Parse instead:
var eventDate = DateTimeOffset.Parse(itemEventProperties.AfterProperties["EventDate"].ToString()).DateTime;
DateTimeOffset represents instantaneous time, i.e. when we do a parse, it will ignore the seemingly UTC time format. All that's left is finding what time zone the 5pm is in. I don't know what goes on behind the scenes, but SharePoint determines the time zone in the following order. First it will use the time zone set in your user profile, editable through the About Me link below your user name on the top right of all SharePoint pages. If you choose to use the regional settings defined by site administrators, SharePoint will use the web's regional settings (Site Settings > Site Administration > Regional Settings > Time Zone). In CSOM, we can get user profile properties from the PeopleManager object or through the REST API SP.UserProfiles.PeopleManager. There are two properties that are of relevance here: SPS-RegionalSettings-FollowWeb and SPS-TimeZone.
The former is set to true if the user follows the web's regional settings, while the latter stores the description of the SharePoint time zone set by the user, e.g. "(UTC-08:00) Pacific Time (US and Canada)". Once a time zone is set, SPS-TimeZone will retain the value even if the user switches back to following the web's regional settings. Although you could extract the UTC offset from the time zone description, you would need to include daylight saving time in your calculation. The proper way is to convert it to a time zone object, specifically a TimeZoneInfo object. The web service I am calling also requires a TimeZoneInfo object when you pass in a date/time. The TimeZoneInfo class has a method called GetSystemTimeZones. To list all the system time zones, run the following in a console program:
foreach (var timeZoneInfo in TimeZoneInfo.GetSystemTimeZones()) Console.WriteLine(timeZoneInfo.DisplayName);
The DisplayName property of a TimeZoneInfo object returns a similar description, so we should be able to get a TimeZoneInfo object by looping through all the system time zones and matching the DisplayName with what is returned from the SharePoint user profile. Well, not so fast. SharePoint uses its own time zone object that has a slightly different set of descriptions. You can list all the SharePoint time zones in a console program using:
var spTimeZones = SPRegionalSettings.GlobalTimeZones;
foreach (var spTimeZone in spTimeZones) Console.WriteLine(spTimeZone.Description);
If you compare the two sets of descriptions, the majority of them match, assuming you have the latest system and SharePoint updates.
There are some differences, and they are shown below:
SharePoint Time Zone | System Time Zone
(UTC-08:00) Pacific Time (US and Canada) | (UTC-08:00) Pacific Time (US & Canada)
(UTC-07:00) Mountain Time (US and Canada) | (UTC-07:00) Mountain Time (US & Canada)
(UTC-06:00) Central Time (US and Canada) | (UTC-06:00) Central Time (US & Canada)
(UTC-05:00) Eastern Time (US and Canada) | (UTC-05:00) Eastern Time (US & Canada)
(UTC-05:00) Bogota, Lima, Quito | (UTC-05:00) Bogota, Lima, Quito, Rio Branco
(UTC-02:00) Mid-Atlantic | (UTC-02:00) Mid-Atlantic - Old
(UTC-01:00) Cape Verde Is. | (UTC-01:00) Cabo Verde Is.
(UTC+02:00) Athens, Bucharest, Istanbul | (UTC+02:00) Athens, Bucharest
(UTC+05:00) Tashkent | (UTC+05:00) Ashgabat, Tashkent
We could build an internal mapping table (which I would advise against) or perform several rounds of matching. To begin, get the user profile time zone property, replace " and " with " & ", and use that as the base for matching. The first round of matching is an exact match, followed by a "contains" match if no match is found. The "contains" match is two-way, i.e. test whether SharePoint's time zone description contains the system's time zone description and vice versa. If there is still no match, remove the offset component from both descriptions and do a two-way "contains" match again. Changes to time zone and daylight saving information are deployed through system and SharePoint updates; this last round of matching is necessary to account for the fact that they may not be in sync, or you may not have the latest updates of one or the other deployed in your environment. That still leaves (UTC-01:00) Cape Verde Is. You could add this time zone to your matching logic, or ignore it if you know for sure you don't have users in that time zone. As mentioned before, if the user chooses to use the regional settings defined by site administrators, SharePoint will fall back to the web's regional settings.
The following CSOM code will return the time zone description of the web's regional settings, and you can pass that through the same matching logic to retrieve a TimeZoneInfo object:
var regionalSettings = context.Web.RegionalSettings;
context.Load(regionalSettings.TimeZone);
context.ExecuteQuery();
spTimeZoneDescription = regionalSettings.TimeZone.Description;
[Updated 2015-09-15] Microsoft has confirmed the issue with dates in AfterProperties not showing in UTC time for calendar lists, which is what I used here. There is no issue with custom lists. Hopefully there will be a hotfix to address this.
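To make the multi-round matching concrete, here is a minimal C# sketch. The helper name ResolveTimeZone and the exact normalization details are my own; only the strategy (replace " and " with " & ", exact match, two-way "contains", then strip the offset prefix) comes from the approach described above.

```csharp
using System;
using System.Linq;

static class SpTimeZoneResolver
{
    // Resolve a SharePoint time zone description, e.g.
    // "(UTC-08:00) Pacific Time (US and Canada)", to a system TimeZoneInfo.
    public static TimeZoneInfo ResolveTimeZone(string spDescription)
    {
        // SharePoint writes "and" where Windows display names use "&".
        var target = spDescription.Replace(" and ", " & ");
        var zones = TimeZoneInfo.GetSystemTimeZones();

        // Round 1: exact match on DisplayName.
        var match = zones.FirstOrDefault(z => z.DisplayName == target);
        if (match != null) return match;

        // Round 2: two-way "contains" match.
        match = zones.FirstOrDefault(z =>
            z.DisplayName.Contains(target) || target.Contains(z.DisplayName));
        if (match != null) return match;

        // Round 3: drop the "(UTC±hh:mm)" prefix from both sides, in case
        // time zone updates are out of sync between SharePoint and the system.
        var stripped = StripOffset(target);
        return zones.FirstOrDefault(z =>
            StripOffset(z.DisplayName).Contains(stripped) ||
            stripped.Contains(StripOffset(z.DisplayName)));
    }

    private static string StripOffset(string description)
    {
        var closing = description.IndexOf(')');
        return closing >= 0 ? description.Substring(closing + 1).Trim() : description;
    }
}
```

A null result means no round produced a match (the Cape Verde case, for example), which the caller has to handle explicitly.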




Posted on:
Categories: SharePoint;Office 365
Description:
If you look at an out-of-the-box display template like Item_TwoLines.html, you will notice ~sitecollection, a SharePoint token that gets replaced by the site collection URL during rendering:
$includeLanguageScript(this.url, "~sitecollection/_catalogs/masterpage/Display Templates/Language Files/Locale/CustomStrings.js");
It is best practice to use the site collection token because you cannot assume the site collection always starts at the root. You can do something similar with CSS by using includeCSS:
$includeCSS(this.url, "~sitecollection/Style Library/MyCompany/Styles/global.css");
You can also use replaceUrlTokens, like:
var imagePath = Srch.U.replaceUrlTokens("~sitecollection/Style%20Library/MyCompany/Images/");




Posted on:
Categories: SharePoint
Description:
Sorry folks, it is not possible as far as I know. You cannot use EventFiringEnabled as it is not supported in remote event receivers. There is a way to work around it though. Let's say you register remote event receivers for ItemAdded and ItemUpdating. In ItemAdded, you would like to update a field and skip your custom ItemUpdating processing. What you could do is put the following check at the beginning of ItemUpdating:
if (itemEventProperties.AfterProperties.Count == 1 && itemEventProperties.AfterProperties.ContainsKey("MyToBeUpdatedFieldInternalName"))
    return new SPRemoteEventResult { Status = SPRemoteEventServiceStatus.Continue };
Unlike a normal update, the AfterProperties collection will contain your updated field only, so you will know the update was done programmatically in your ItemAdded event.
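Put together, a sketch of the whole workaround might look like the following; the field name and the surrounding receiver structure are illustrative, not taken from a real project.

```csharp
// Sketch: ItemUpdating handler that skips custom processing when the
// update was triggered by our own ItemAdded code, which sets exactly
// one field. "MyToBeUpdatedFieldInternalName" is a placeholder.
public SPRemoteEventResult ProcessEvent(SPRemoteEventProperties properties)
{
    var result = new SPRemoteEventResult { Status = SPRemoteEventServiceStatus.Continue };

    if (properties.EventType == SPRemoteEventType.ItemUpdating)
    {
        var after = properties.ItemEventProperties.AfterProperties;

        // A programmatic update from ItemAdded carries only the one field;
        // a normal user edit carries the full set of changed properties.
        if (after.Count == 1 && after.ContainsKey("MyToBeUpdatedFieldInternalName"))
        {
            return result; // let our own update through untouched
        }

        // ... custom ItemUpdating processing goes here ...
    }

    return result;
}
```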




Posted on:
Categories: Business;SharePoint;Office 365
Description: In the midst of keeping track of what's changing in the cloud, making decisions on what you should be doing with the cloud, and ensuring that you are cloud ready, it is easy to forget one major aspect of adopting such a large change. And that one forgotten piece is usually governance.
In the midst of keeping track of what's changing in the cloud, making decisions on what you should be doing with the cloud, and ensuring that you are cloud ready, it is easy to forget one major aspect of adopting such a large change. And that one forgotten piece is usually governance. Hopefully by now you have already come to understand the importance of governance for SharePoint. The discussion will always continue on the best way to create and promote governance, but the underlying importance of some form of governance is undeniable. You may already have a governance plan, but let's explore some of the new things you have to incorporate into your plan in order to accommodate the complexities of the cloud. For now I want to focus on three key elements.
How to handle new features
Just recently we have seen two new features introduced to the cloud environment, Delve and Office 365 Video. As the release cycle is shortened, we are going to be seeing more and more changes and updates coming to the cloud. Most likely, your current governance plan isn't set up to support a software model that sees new features being introduced constantly. As these new features pop up, there has to be a method of formally reviewing them and seeing how they fit into the current organization. This means that you have to have a role established within your environment to identify new features. Along with the role you then have to have a process defined for how to evaluate and, if applicable, implement the feature. This role may not be a single person alone, but instead a person leading the discovery of features and then bringing them forward to the steering committee to review their effectiveness.
How to handle sharing external content
One of the greatest features of SharePoint Online is the ability to share content with external parties. This can be both exciting and scary. Once enabled, this means someone within your organization could potentially share "internal" content with anyone that has an email address. Your governance plan must outline what your organization's policy is for sharing content with external parties. Will users be able to share content externally? Who will make the decision whether a site can be shared externally? How are your users made aware of what can and cannot be shared externally? These are just some of the questions that your governance should be able to answer.
How to handle storing content on-premises and in the cloud
In many cases, organizations are going to be adopting a hybrid approach, keeping some content on-premises and some content in the cloud as it makes sense for their usage scenarios. This hybrid approach now creates a potential problem: the end user is now potentially faced with yet another location where they may be sharing content. The last thing you want to do is confuse your users. Within your governance plan you should outline, at a high level, the information areas that exist within your environment and what type of information goes in each of those areas, with specific examples of the types of documents/content. Now this isn't just a part of governance; this is something that must be incorporated into end user training as well. This is just the beginning. As the evolution of the digital workplace carries on, we will have to continue to adjust the way we ensure that these tools have guidelines that fit our organization and its end users. At the end of the day it's about empowering your users with tools that help them do their job, but at the same time ensuring there is governance in place to provide a clear direction for all.




Posted on:
Categories: SharePoint
Description: As term sets become increasingly interrelated, their design can benefit from formal data design thinking.
Introduction
Term sets often start out their lives with an isolated purpose but, before we know it, we have multiple areas of the organization depending on them. It is often only after they have taken on their enterprise role that the cracks in their design start to show. By that stage a lot of content is tied to them, and that can complicate re-organization. Having arrived at that point more often than I'd like, with term sets that re-use from other term sets, I've wondered whether we could get more scientific about term set design and see red flags earlier. This post proposes that analysing the cardinalities in the domain model can reveal at least one such red flag.
The outcome in a nutshell
- Implement a one-to-one relationship as a (reversible) third composite term set
- Implement a one-to-many relationship as a (non-reversible) third composite term set
- Generally, avoid implementing a many-to-many relationship as a third composite term set
The first two are trivial but lay a foundation for talking about the third, which highlights an anti-pattern.
Single Identity
It will be assumed that you're already following a practice of establishing elementary base term sets and then reusing from those, uni-directionally, into composite term sets, as discussed in Good Practice Update: Single Identity in Taxonomies. The principle there is that you don't want terms with the same meaning being duplicated with different identities. Rather, define them as simple, uncontroversial components in one place and then re-use those building blocks in other arrangements to address the various perspectives the organization needs.
Start with a domain model
Constructing a domain model is an important starting point. This is not a technical model but a conceptual model of the real world, covering those things that are relevant to the current scope. If term sets are to fit the world naturally, they must be based on a model that properly describes the real world.
Clear definitions are essential since they will help ensure that base term sets are constrained to single, unambiguous concepts. Unconscious mixing of concepts leads to downstream problems. The following will serve as a sample domain model for this post. In case you're unfamiliar with the crow's foot notation, here is how it should be read. The examples have been chosen so as to cover the three main cardinality types.
One-to-many (1:N)
- A Geopolitical Location contains zero or more Sites
- A Site is located in one and only one Geopolitical Location
One-to-one (1:1)
- A Site has one and only one General Manager
- A General Manager manages one and only one Site
Many-to-many (M:N)
- A Site is operated by one or more Business Units
- A Business Unit operates at one or more Sites
These relationships will differ across organizations. Perhaps in your business a Site is operated by one and only one Business Unit. The data model will have a direct impact on how your term sets are structured, so it's important to ensure that it depicts your domain accurately. My argument is that the presence of any many-to-many relationship represents a red flag. With a domain model defined, we'll now move on to the corresponding term set patterns.
Pattern A: Implement a one-to-one relationship as a (reversible) third composite term set
1:1 relationships are straightforward to implement. If there is a need to represent the combination of two entities, then introduce a third term set which corresponds to the associative entity between the first two entities. Re-use terms from the two base term sets into the composite one. One site (such as Calcutta Operations) has one and only one general manager (Sam Govender). Since this relationship is symmetrical, the term set structure can be reversed: General Managers may be placed beneath Sites or Sites beneath General Managers, whatever suits the purpose. Resist the urge to get away with only two term sets.
Keeping the base term sets elementary is more scalable since they can be arranged into a variety of other combinations in other term sets. This is pretty obvious, so let's move on to the next pattern.
Pattern B: Implement a one-to-many relationship as a (non-reversible) third composite term set
In this example, to be more realistic, we're using a base geopolitical locations term set that already has some hierarchy of its own. Don't let that confuse you; each term still corresponds to a single entity in the real world, such as India or United Kingdom. In the case of one-to-many relationships, a third term set will also usually be introduced which corresponds to the associative entity. This starts out the same as for one-to-one relationships. In contrast, however, the top-level terms here are always derived from the entity at the singular end and the bottom-level terms from the multiple end of the relationship. One geopolitical location (such as India) has many sites (Calcutta Operations, Calcutta Sales Office).
Pattern C: Avoid implementing a many-to-many relationship as a composite term set
It is tempting to combine the elements of a many-to-many relationship into a composite term set, but doing so is usually a mistake. The term set story for many-to-many relationships is not great. SharePoint term sets are simple hierarchies, and each term set can consume a term only once. In the above, I was therefore forced to create the last term as a duplicate of Global Sales Office, and thus begins the slippery slope of losing single identity. We'd now end up with Global Sales Office documents tagged with different IDs across the landscape, and so lose our ability to reliably locate all documents that pertain to that office. So what to do? In my opinion M:N should resolve to separate term sets and fields in most cases. Instead of trying to combine the Business Units and Sites term sets into one, I would keep them apart and present them as separate fields. There is a downside to doing so.
The information worker could now pick the business unit Cosmetics and the site Calcutta Operations, which is an invalid combination. In pre-termstore days, many folk dealt with this situation with custom cascading dropdowns. In the termstore world, we've resorted to dealing with it by constraining the terms available beneath a parent term. Unfortunately, re-use does not play well with that approach in the many-to-many case, as explained above. So which is the lesser of two evils?
- Forfeit the single identity principle
- Allow invalid data combinations to be selected
It may depend on the drivers in your environment, but thus far I have always insisted that we uphold single identity. In my experience, losing that has far-reaching, negative impacts not just on current objectives but on our ability to improve information architecture over time.
Conclusion
Starting with a domain model helps to avoid pitfalls when designing data structures. This holds true for SharePoint term sets. In particular, beware of many-to-many relationships: they do not translate well into combined term sets.




Posted on:
Categories: SharePoint
Description: SharePoint list views that include projected fields can be provisioned via CSOM.
In the SharePoint web interface, projected fields for a lookup field can be specified on the field settings page. The extra fields from the lookup list can then be consumed in views. The following CSOM code can be employed to provision such views remotely. In this example I have a list named Registration which has a lookup field pointing to a list named Calendar. The internal name of the lookup field is calendarLookupField. The objective is to add two projected fields as illustrated above.
using (var ctx = new ClientContext(url))
{
    ctx.Load(ctx.Web, w => w.Id);
    ctx.ExecuteQuery();
    var list = ctx.Web.Lists.GetByTitle("Registration");
    var view = list.DefaultView;
    view.ViewJoins =
        "<Join Type='LEFT' ListAlias='cal'>" +
            "<Eq>" +
                "<FieldRef Name='calendarLookupField' RefType='Id' />" +
                "<FieldRef List='cal' Name='ID' />" +
            "</Eq>" +
        "</Join>";
    view.ViewProjectedFields = String.Format(
        "<Field Name='eventId' Type='Lookup' List='cal' ShowField='ID' WebId='{0}'/>" +
        "<Field Name='start' Type='Lookup' List='cal' ShowField='EventDate' WebId='{0}'/>",
        ctx.Web.Id.ToString("D"));
    view.ViewFields.Add("eventId");
    view.ViewFields.Add("start");
    view.Update();
    ctx.ExecuteQuery();
}
The names in the code are aliases, which I chose arbitrarily; avoid using spaces, etc. when naming them.
- cal refers to the Calendar list
- eventId refers to the Id field of the Calendar list
- start refers to the EventDate field of the Calendar list
Some learnings I went through: The documentation makes no reference to a WebId attribute on the projected field definition, but omitting it leads to a runtime error on the view: "Unable to cast object of type 'System.Xml.XmlElement' to type 'System.String'". Thanks to Sergei Snitco for providing that insight here. It may seem weird at first that nothing in the code refers explicitly to the list that we're getting the projected fields from (Calendar in this case). As per the MSDN reference at the end, this is because the lookup field holds that relationship implicitly.
I initially started down the path of omitting the join expression from my code and relying on an implicit join. Then I noticed this advice on MSDN: "We do not recommend working without a Joins element. You will maximize your solution's chances of being compatible with future releases of Microsoft SharePoint Foundation by always using an explicit Join element."
References:
- List Joins and Projections: https://msdn.microsoft.com/en-us/library/office/ee539975(v=office.14).aspx
- SP 2010 List Joins & SPQuery enchancements, Tobias Zimmergren: http://zimmergren.net/technical/sp-2010-list-joins-spquery-enchancements




Posted on:
Categories: Office 365;SharePoint
Description:
Even though it's still possible (and sometimes tempting) to build and deploy SharePoint Sandbox Solutions for SharePoint Online (Office 365), they have been deprecated in favour of SharePoint Add-ins (previously SharePoint Apps). There is information provided by Microsoft in regards to the limitations of Sandbox Solutions, but there are some undocumented limitations, especially when leveraging them in SharePoint Online. Recently I was tasked to build a simple web part that would create a basic CSV report of specific content in a site collection. Due to the timing and budget, it was determined a Sandboxed Solution would be the best approach, as this web part would only be used on a temporary basis until a more permanent solution was implemented. The web part was built quickly and performed the following:
- Iterated through all the sites and document libraries in the site collection
- Added audit related details to a CSV document to be stored on the site in a designated document library
The feature that deployed the web part had a feature receiver associated with it that:
- Created a document library to store the reports
- Created a SharePoint Group for securing the document library
In an on-premises SharePoint 2013 environment there were no issues at all with the solution and it functioned as expected. Tests were then performed on a SharePoint Online site where the solution worked as expected. Once moving into more detailed testing I started to run into issues. On one site collection I was unable to even activate the Sandbox Solution; I would receive the error "Sandboxed code execution request failed" with a correlation id. Further testing identified the issue as being in the feature activation code that was creating the SharePoint Group. Initially I thought the issue might be related to the Resource Quota configured for the site collection, so I bumped that up with no success.
Next I manually created the SharePoint Group (the feature receiver would skip the group creation if it already existed) and tried to activate the solution again. This time it succeeded. Comparing the two site collections, I noticed the one that the solution was failing to activate on had dozens of SharePoint Groups, compared to five on my test site. So with a lot of SharePoint Groups, checking whether a specific group existed and creating it if not caused issues. The next set of tests I performed was with the web part. I modified the source code to only generate the report from the content of a single site (instead of all sites in the site collection); that worked as expected. Next I changed the code from recursively navigating the site structure to iterating through the SPSite.AllWebs collection, which failed. I received the "Web Part Error: Sandboxed code execution request failed. Correlation ID…" error message. I then went back to the SharePoint Online site that the web part was working on, and I continued to add new sites and document libraries until the web part was failing there as well. As my test tenant was not as busy as the other one (where even the solution activation was failing), the web part solution was able to generate reports where similar site structure and content existed. This behaviour leads me to believe that there are further restrictions on Sandbox code execution outside of the Resource Quotas. As the collections being interacted with via Sandboxed code grew larger (e.g. SPWeb.SiteGroups), calls were failing, and as the memory demands of the solution increased during runtime, a threshold was being hit which halted the process. It's possible that SharePoint Online has boundaries set on the User Code services used to execute Sandboxed solutions, but I haven't been able to source any details on these. After hitting these unknown restrictions, the best option to provide this basic solution would be a Provider-Hosted SharePoint Add-in.
This would allow for the creation of the reports without hitting any SharePoint Online limitations, and it also removes any performance issues that could occur in a SharePoint-Hosted Add-in.
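For context, the feature receiver logic that tripped the sandbox limits followed the familiar check-then-create pattern sketched below; the group name and description are hypothetical, not the actual project code.

```csharp
// Sketch of the sandboxed feature receiver: create the SharePoint Group
// for the reports library only if it does not already exist. Enumerating
// SPWeb.SiteGroups like this is what appeared to hit SharePoint Online's
// sandboxed execution limits once the collection held dozens of groups.
public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    var site = (SPSite)properties.Feature.Parent; // site scoped feature
    var web = site.RootWeb;
    const string groupName = "Audit Report Readers"; // hypothetical name

    bool exists = false;
    foreach (SPGroup group in web.SiteGroups)
    {
        if (group.Name.Equals(groupName, StringComparison.OrdinalIgnoreCase))
        {
            exists = true;
            break;
        }
    }

    if (!exists)
    {
        web.SiteGroups.Add(groupName, web.CurrentUser, web.CurrentUser,
            "Secures the audit reports document library"); // hypothetical description
        web.Update();
    }
}
```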




Posted on:
Categories: SharePoint
Description:
When working with custom developed web templates in an on-premises SharePoint environment, it's sometimes required to customize the navigation settings or add links to external resources to the navigation. One request was for all external resource links to open in a new browser tab. When the site is non-publishing based (e.g. a Team site), the navigation settings don't allow for the configuration of opening links in a new tab/window. As the navigation was being configured as the site was being created, it made sense to:
- Create a site scoped feature for "Navigation Configuration"
- Add a feature receiver to the new feature
- Staple the feature in the web template's Onet.xml definition file
Inside the feature receiver I used the following code to add a new link (as a child link under a resource header) that opens in a new tab:
var web = (SPWeb)properties.Feature.Parent;
//create the new QuickLaunch heading and add it to the navigation
var quicklaunchNav = web.Navigation.QuickLaunch;
var headingNode = new SPNavigationNode("Resources", "", false);
quicklaunchNav.AddAsLast(headingNode);
//add the new navigation node as a child to the heading
var node = new SPNavigationNode("Link Title", "Link URL", true);
headingNode.Children.AddAsLast(node);
//update the navigation node properties to open in a new tab
node.Properties["Target"] = "_blank";
node.Update();
web.Update();
I've done some research to see if there's a CSOM or REST API method of configuring the navigation node's target, but it doesn't appear to be supported yet; hopefully in the future that will be added.




Posted on:
Categories: Business;SharePoint
Description:
I've been using computers for more than 35 years now. Many years ago I started with a ZX-81 and occupied my family's TV for hours, because this computer needed to be attached to a TV rather than a monitor. Since that time a lot has changed, and I can still remember the moment when I started to use Microsoft Word to create documents during my studies. If you remember these times as well, you will know what this post is all about. Try to think back on how you used client applications like Microsoft Word or Microsoft Excel years ago. Most of us usually started the client application (which took some time, as personal computers weren't as powerful as they are today) and clicked on 'Open' or used the assigned keyboard shortcut to open a file. We usually started the client application first and then opened the document that we wanted to edit by using the file dialog of the application. That's what I call an application-centric way of working. Our focus has been predominantly on the client application. When we thought about editing a document, we usually thought first about the client application that we needed to get the job done. We used to have a different client application for every type of document that we wanted to be able to edit. The same is true for communicating with clients or colleagues: we still think of Microsoft Outlook first when we intend to send an email. Even if we are currently using a client application to edit a document, we think about another client application when we need to send an email to a colleague in regards to the document we are currently editing. We are used to thinking about the tools that we need to get a task done, and in the IT business these tools have been our client applications. But the world has been evolving while we were working like this.
Now that we are accustomed to our beloved application-centric way of working, the IT business has changed dramatically, and that change has a remarkable impact on our tools and the way we work with them. The days of a single application for each type of document are gone forever. We are getting used to working in teams and to thinking in terms of projects. We are getting used to collaboration, which is currently turning into 'Social Collaboration' as its next step of evolution. Modern platforms like Microsoft SharePoint provide elements that allow us to easily set up environments in which we can collaborate. Usually called 'Project Sites' or 'Team Sites', these collaboration environments have a massive impact on the way we deal with documents and data. Although we can still start our beloved client applications and use the file dialog to navigate to the SharePoint library where documents are now stored, this way has become cumbersome and ineffective, and in my opinion very "old-school". Collaboration is changing our old application-centric way of working into a context-oriented way of working. What does that mean, a context-oriented way of working? To answer that question, let's have a look at the common elements a Project Site or Team Site is supposed to provide. Usually Team Sites or Project Sites provide elements to save documents to, elements to store lists of data (like contacts or appointments), and elements that we use to quickly chat with colleagues and exchange information in an informal way. In other words, they provide all the tools that are needed to collaborate on a project. Often they even provide tools that we can use to create or edit most kinds of documents. 
As modern Project Sites or Team Sites ideally provide all the tools we need to get our job done, the need to keep our beloved client applications becomes more and more insignificant, and with it our accustomed application-centric way of working becomes insignificant as well. In a collaborative environment there should be no need to explicitly pick a client application to edit a document or data. Instead, each team member should be able to stay in the context of the current project or team for as long as it takes to get the job done. The ideal Team Site or Project Site supports this context-centric way of working by providing all the tools that are needed. Our task is still "Work on the technical documentation for project XYZ", but with a context-oriented way of working we now navigate to the Team Site or Project Site first and simply click on the document or data that we want to work on. Because the Team Site or Project Site usually provides all the tools we need, we don't have to worry about applications anymore. We now focus on the context (our current task) and don't care about the application (our tool) that we need to work on this task. I admit this is a simple example, but let's think a little bit further. If we need additional information from one of our colleagues, in a context-centric way of working we just use the chat element that the Team Site or Project Site provides to contact that colleague. In an application-centric way of working we would probably switch applications, maybe from Microsoft Word to Microsoft Outlook. This switch of application also switches our context, and we might even get sidetracked by quickly looking at other emails in our inbox, even if they are related to another topic! We can think a little bit further still. Modern Team Sites or Project Sites sometimes provide additional tools as well. 
I've seen Team Sites that provide elements to look up technical or engineering standards, tax calculators or even complex business applications. Modern Team Sites or Project Sites provide these additional tools to enable users to stay within the context while they are collaborating with a team or working on a project. Switching applications usually means a switch of context, which usually makes us work less effectively. The best way to make us work effectively is to provide a context that we can stay within while we are working on a specific task. Based on my long-term experience as a SharePoint consultant, the best way to ensure effective collaboration is to provide Team Sites or Project Sites that include all the elements team members or project members need to work, and to avoid context switches or even application switches. Sounds reasonable, but isn't that too complex to maintain? I don't think so! Teams usually know best what they really need in order to work effectively. It takes some evaluation and requirements analysis in the beginning, but once this has been done, templates for Team Sites or Project Sites can be implemented and used to create them. The effort that needs to be invested in requirements analysis and template implementation will be balanced by an increase in productivity very quickly. I have accompanied some clients from requirements analysis to template implementation, and although a context-centric way of working was new for most of the employees, they adopted the new Team Sites or Project Sites and the new way of working very quickly. Of course, well-planned training fostered the adoption significantly. Now that we have come to this point of our thought experiment, why not take it a step further? A context-centric way of working has another advantage that is worth considering, although it might not be an obvious one. 
Recently I spoke on ways to improve the mobile user experience of SharePoint at an event that we at Softlanding organized. Mobile devices are more and more becoming an important part of our daily business. Think about yourself: are you using a mobile device to access company resources? Have you ever reviewed a document using a mobile device (like a tablet) while on the go? When using a mobile device to access company resources, you almost automatically switch to a context-centric way of working. You usually enter the Team Site or Project Site first, before you attend a chat, check reports or access a document. Although there are apps for mobile devices, they are mostly intended to provide offline capabilities for some types of documents. In an ideal Team Site or Project Site, even mobile devices should be able to use all the tools that the site provides. In other words, by providing all necessary tools, even mobile devices (regardless of their operating system) can be used to collaborate without any restrictions. In my opinion, Team Sites and Project Sites need to provide a working environment that supports all kinds of devices without restrictions. Only then can a seamless integration of mobile devices into business processes be accomplished. Mobile devices are starting to replace fixed desktop computers, and according to the latest numbers they have outnumbered them already. If you are currently working on the implementation of Team Sites or Project Sites, it would be wise to make them mobile-ready. A context-centric way of working also affects data integrity, security and governance. Let's have a concluding look at some important benefits for enterprises. If a Team Site or Project Site supports a context-centric way of working by providing all necessary tools, project-related or team-related data usually stays within the context as well. What does that mean exactly? Why is that important for enterprises? 
Let me give you a simple example to illustrate that. If a team is using Microsoft Outlook to communicate, the team's communication is stored outside of the Team Site and outside of the context that this communication belongs to. Emails and answers are stored in the users' local accounts. Even after a short period of time, the context of a single communication thread gets lost and can't be assigned to a team topic anymore. When thinking of data integrity, security and governance, it is much better to keep data that directly belongs to a team or a project in the Team Site or Project Site. That way, closed projects can be archived very easily without losing important data. In fact, Team Sites and Project Sites can be 'frozen' before they are archived, and even years later they make it possible to reproduce any decision that was taken by a team. And what is true for communication threads is also true for any kind of team-related or project-related data. In a nutshell, Team Sites and Project Sites should not only provide all work-related tools; every kind of information that is directly related to the team or the project should be saved within them as well. Improving the effectiveness of teams and projects is an important goal of a context-centric way of working, but enhanced data integrity, security and governance are important goals as well, and enterprises usually strive hard to achieve them. Modern intranet platforms like SharePoint provide most of the elements that are needed to create context-centric Team Sites and Project Sites out of the box. Additional tools can be added by using the SharePoint add-in model in a future-proof and reusable way. The first step towards a context-centric way of working in your enterprise is to learn the requirements of your teams and simply start working with them on Project Site or Team Site templates that truly match their requirements.




Posted on:
Categories: Business;SharePoint
Description:
Although Windows Phone 10 (or, as it might be called when it leaves the preview phase, 'Windows 10 Mobile') hasn't been released yet, everyone can install the technical preview by joining the Microsoft Insider program. Usually I don't install preview versions on my devices, but I must admit I was too curious about Windows 10 Mobile, so I joined the Microsoft Insider program and the so-called 'Fast Ring' and installed Windows 10 Mobile (version 10166) on my Lumia smartphone. A few weeks ago we had our event 'Workplace Anywhere - Be Mobile, Safely' and we did some live demos to show what mobile working can look like today. Now that I had installed Windows 10 Mobile on my smartphone, I was curious how the Nintex forms I created would look in Microsoft's new mobile browser, 'Edge'. I simply navigated to our demo site and accessed the 'Vacation Request' form I created for our demo. It wasn't really a surprise to me that the form didn't show up. Although Windows 10 Mobile is currently a technical preview (in fact there are still some missing features and glitches), I refused to believe that Microsoft's new mobile browser Edge isn't able to show a common Nintex form. That's why I started to investigate this issue. I started by having a closer look at the mobile form and its settings. First I selected the mobile form by clicking on the 'Smart Phone' icon, and after that I clicked on 'Layout Settings'. When I opened the 'Advanced' section, I noticed that there is a list of user agent strings. The Nintex form checks the user agent string before it is rendered. As Windows 10 Mobile uses a new mobile browser, Microsoft might also have changed its user agent string. The best way to check that was to look for a site which displays the user agent string. I decided to use this site and opened it with Edge on my smartphone. As expected, Microsoft's new mobile browser uses a new user agent string. 
I updated the list of user agent strings of my Nintex form to include the new Edge string. Next I saved the form settings and published the updated form to the SharePoint list. Time for a second try: again I opened the 'New Item' link with Edge.
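To illustrate the kind of check involved, here is a hypothetical sketch (the function name, pattern list and abbreviated user agent string are illustrative only, not Nintex's actual implementation): a form engine that gates rendering on the user agent essentially tests the browser's UA string against a list of patterns, so a brand-new browser fails the check until its pattern is added.

```javascript
// Hypothetical sketch of a user-agent allow-list check. The pattern list,
// function name and abbreviated UA string below are illustrative only.
function isRecognizedMobileBrowser(userAgent, patterns) {
  // Render the mobile form only if any known pattern matches the UA string
  return patterns.some(function (pattern) {
    return new RegExp(pattern, "i").test(userAgent);
  });
}

// A pattern list that predates Edge will not match an Edge UA string...
var patterns = ["iPhone", "Android", "IEMobile"];
// ...such as this abbreviated Windows 10 Mobile UA (illustrative):
var edgeUA = "Mozilla/5.0 (Windows Phone 10.0) AppleWebKit/537.36 Edge/12.10166";
```

Adding an "Edge/\d+"-style pattern to the list is then what makes the form render in the new browser.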




Posted on:
Categories: SharePoint
Description: Learn how to customize Nintex forms with JavaScript and jQuery, using a leave request scenario as an example.
Introduction

Recently, we had a client who wanted an interactive form to enter leave request information and dates. In this form, the user must be able to select multiple ranges of dates, and the form must calculate the number of working days excluding weekends and holidays. The solution we came up with used JavaScript and jQuery to do some simple lookups and calculations. Let's jump right into this challenge and build it step by step from scratch!

First Steps

Recording the date ranges is easily done by using a Repeating Section control in the form. Set both the Name and CSS Class of the date pickers to dateFromPicker and dateToPicker. To calculate the days between the dates in an individual row, we can use a Calculated Value control with its Name and CSS Class set to days and the formula dateDiffDays(dateToPicker, dateFromPicker) + 1. We can also add a total days Calculated Value control with the formula sum(days) to the bottom of the repeating section. Add a Total Days and a Days Off label, clean things up a bit, and then publish or preview the form. You should now have a working form where you can calculate the days between ranges and see the total number of days. So now we have a working form, but we want to polish it up a bit. First, let's remove the ugly borders from the repeating section. Secondly, let's remove the alternating background color scheme that appears when we have multiple rows in the repeating section. Finally, we notice that the days column defaults to 1 even when no dates are selected. Therefore, we can expand the days calculation to default to 0 when either of the date picker fields is empty, using the formula If(or(isNullOrEmpty(dateFromPicker), isNullOrEmpty(dateToPicker)), 0, dateDiffDays(dateToPicker, dateFromPicker) + 1).

Advanced Steps - Counting Weekends And Holidays

The previous steps have been relatively straightforward. 
We now have a clean-looking form that figures out how many days lie between two dates. However, the calculated days do not yet take weekends or holidays into account. We will first replace the dynamic Calculated Value days control with a simple Label control that we can write to. Set the new label's Name and CSS Class to days like before. To subtract weekends between two dates, we pass the values from the date fields into a JavaScript function that loops through the dates, checks whether each one falls on a weekend, and returns the number of weekend days. Instructions on how to insert these scripts will be provided later. Let's first have a look at the getAllWeekendDays and isWeekendDay functions:

function getAllWeekendDays(date1, date2) {
    // Initialize date objects and variables
    var date1Obj = new Date(date1);
    var date2Obj = new Date(date2);
    var numWeekendDays = 0;

    // Loop over the days in the date range
    while (date1Obj <= date2Obj) {
        // Add one to numWeekendDays if the day is a weekend day
        if (isWeekendDay(date1Obj)) {
            numWeekendDays += 1;
        }
        // Move on to the next day
        date1Obj.setDate(date1Obj.getDate() + 1);
    }
    return numWeekendDays;
}

function isWeekendDay(dateToCheck) {
    var dateObj = new Date(dateToCheck);
    // The Date.getDay method returns 0 for Sundays and 6 for Saturdays
    if (dateObj.getDay() === 0 || dateObj.getDay() === 6) {
        return true;
    }
    return false;
}

To get the number of holidays between two dates, we first need to define which days are holidays. 
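Before moving on to holidays, a quick sanity check of the weekend logic in a compact, self-contained form (the helper names here are restated for the check; note that JavaScript's Date.getDay returns 0 for Sunday and 6 for Saturday, not 5 and 6):

```javascript
// Compact, self-contained restatement of the weekend helpers for a quick check.
function isWeekendDay(d) {
  var day = new Date(d).getDay();
  return day === 0 || day === 6; // 0 = Sunday, 6 = Saturday
}

function countWeekendDays(from, to) {
  var cur = new Date(from), end = new Date(to), n = 0;
  while (cur <= end) {
    if (isWeekendDay(cur)) n += 1;
    cur.setDate(cur.getDate() + 1); // advance one day
  }
  return n;
}
```

For example, July 10-13, 2015 spans Friday through Monday, so the count should be exactly two weekend days (the Saturday and the Sunday).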
We first create an out-of-the-box SharePoint Calendar list named Holidays in the same site as the Nintex form, and then add some holidays to it. Then, in your form, use this JavaScript function to get the holiday days:

function numberOfVacationAndWeekendDays(date1, date2) {
    // Use a Deferred object to register callbacks for the asynchronous query
    this.d = $.Deferred();
    this.Date1 = date1;
    this.Date2 = date2;

    // Prepare the list query to use on the Calendar list
    var query = "<Query><Where><And><Geq><FieldRef Name='EventDate' />" +
        "<Value Type='DateTime' IncludeTimeValue='false'>" + date1 + "</Value></Geq>" +
        "<Leq><FieldRef Name='EventDate' /><Value Type='DateTime' IncludeTimeValue='false'>" +
        date2 + "</Value></Leq></And></Where></Query>";

    // Get the client context
    var clientContext = new SP.ClientContext();

    // Get the SharePoint list named 'Holidays' from the current site
    var oList = clientContext.get_web().get_lists().getByTitle('Holidays');

    // Prepare the final list query
    var camlQuery = new SP.CamlQuery();
    camlQuery.set_viewXml("<View>" + query + "</View>");

    // Query the list using the defined query in an asynchronous call
    this.listItems = oList.getItems(camlQuery);
    clientContext.load(this.listItems);
    clientContext.executeQueryAsync(Function.createDelegate(this, queryOnSuccess),
        Function.createDelegate(this, queryOnError));

    return this.d.promise();
}

function queryOnSuccess(sender, args) {
    // Count the number of calendar list items found between the date ranges
    this.numDays = this.listItems.get_count();
    currentHolidayNum = this.numDays;
    var listItemEnumerator = this.listItems.getEnumerator();

    // Loop through all items found
    while (listItemEnumerator.moveNext()) {
        var oListItem = listItemEnumerator.get_current();
        var eventDate = oListItem.get_item('EventDate');
        var allDay = oListItem.get_item('fAllDayEvent');
        if (allDay) {
            // If an item has fAllDayEvent = true, the date returned is not in a
            // 'standard' form; we must use toISOString to convert it to the ISO standard
            eventDate = eventDate.toISOString();
        }
        // If this holiday falls on a weekend, subtract 1 so it won't get counted twice
        if (isWeekendDay(eventDate)) {
            currentHolidayNum -= 1;
        }
    }

    // Get all weekend days between the input dates
    var numWeekendDays = getAllWeekendDays(this.Date1, this.Date2);

    // Return all holidays (that are not on weekends) plus all weekend days
    this.d.resolve(currentHolidayNum + numWeekendDays);
}

function queryOnError(sender, args) {
    // Show an error message if the query fails
    this.d.reject("An error occurred!");
    alert("Error encountered while trying to calculate the number of holidays and weekend days.");
}

Credits to Ryan McCarney for generously providing the scripts above. Thank you!

Advanced Steps - Attaching Custom Events To Nintex Date Pickers

To attach scripts to your Nintex form, the recommended way is to save the scripts to .js files and upload them to your site assets folder. Then, in your form's settings, reference the uploaded files. Now we have the code saved to files and referenced from our form. The next step is to attach the functions in the scripts to our date picker controls, so that when a user selects dates from the popup calendar, an asynchronous event fires, attempts to retrieve the number of weekends and holidays, and updates the days column appropriately. To do this, we have to modify the default date picker controls and append our function calls to their onSelect attribute. After some digging around, I found that the date picker control is initialized by Nintex by taking the nfFillerLoadFunctions stack and popping the first item out, which contains the function that creates the date picker objects. 
Therefore, we can 'hijack' this stack by popping out and discarding the default date picker initializer, and then inserting our own:

// Our own date picker initializer function, copied from the default Nintex
// date picker with a modification to the onSelect property
function nfInitDatePicker() {
    NWF$(".nf-date-picker").datepicker({
        showOn: "button",
        buttonImage: "/_layouts/15/images/calendar.gif",
        buttonImageOnly: true,
        changeMonth: true,
        changeYear: true,
        showOtherMonths: true,
        selectOtherMonths: true,
        showButtonPanel: true,
        dateFormat: "m/d/yy",
        nextText: "Next",
        prevText: "Prev",
        buttonText: "Select a date.",
        currentText: "Today",
        closeText: "Done",
        monthNamesShort: ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"],
        monthNames: ["January", "February", "March", "April", "May", "June", "July", "August", "September", "October", "November", "December"],
        isRTL: false,
        dayNames: ["Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday"],
        dayNamesMin: ["Su", "Mo", "Tu", "We", "Th", "Fr", "Sa"],
        dayNamesShort: ["Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat"],
        firstDay: 0,
        onSelect: function () {
            var that = NWF$(this);
            this.fireEvent && this.fireEvent('onchange') || that.change();
            if (that.blur) {
                that.blur();
            }
            // Trigger our event to calculate the leave days for this row
            DatesAddedOrChanged(NWF$(this).closest(".nf-repeater-row").index());
        },
        beforeShow: function (input, inst) {
        }
    });

    // If a date control's DOM element is disabled, disable the date picker
    // functionality as well (especially when the control is disabled by default)
    var dateControls = NWF$('.nf-date-picker:disabled:visible');
    var dateControlsCount = dateControls.length;
    for (var k = dateControlsCount - 1; k >= 0; k--) {
        NWF$(dateControls[k]).datepicker('disable');
    }
}

NWF.FormFiller.Events.RegisterAfterReady(function () {
    nfFillerLoadFunctions.pop();                  // Discard the default date picker initializer
    nfFillerLoadFunctions.push(nfInitDatePicker); // Insert our own date picker initializer
});

function DatesAddedOrChanged(repeaterRowIndex) {
    // Get the 'date from' and 'date to' values from the current row
    var dateFrom = NWF$(".dateFromPicker:eq(" + repeaterRowIndex + ") input").val();
    var dateTo = NWF$(".dateToPicker:eq(" + repeaterRowIndex + ") input").val();

    // Get the total number of holidays and weekend days
    var numHolidaysAndWeekend = numberOfVacationAndWeekendDays(dateFrom, dateTo);

    // Set the value of the 'days' label control
    numHolidaysAndWeekend.done(function (result) {
        // Subtract holidays + weekends from the total days. Please implement your
        // own dateDiff() function: https://stackoverflow.com/questions/3299972
        // See (1) below for more details
        var totalOffDays = dateDiff(dateFrom, dateTo) - result;
        if (totalOffDays <= 0) {
            NWF$(".days:eq(" + repeaterRowIndex + ") input").val(0);
        } else {
            NWF$(".days:eq(" + repeaterRowIndex + ") input").val(totalOffDays);
        }
    });
    numHolidaysAndWeekend.fail(function (result) {
        var error = result;
        console.log(error); // Handle errors
        alert("Error encountered while trying to calculate the number of holidays and weekend days.");
    });
}

Wrapping Things Up

The above solution will work nicely when you need to check a range of dates for certain values like holidays and weekends. With some additional programming, you could modify it further to check other things, such as whether someone from the same department is already on vacation, or "black out" ranges where managers do not allow vacation to be taken. You could then decide to show these restricted days as an alert to the user, cause the date fields to fail validation with an error message, or just automatically deduct the days and append a comment to an additional comments field. If you want to take it another step further, you could calculate the number of vacation days accrued since the user started work up to the vacation dates, and see whether the requested days would exceed what was accrued. 
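As noted above, the dateDiff() helper referenced in DatesAddedOrChanged is left for you to implement. One possible minimal sketch (an inclusive whole-day count computed on UTC day boundaries to sidestep daylight-saving drift; this is just one approach, not the code from the linked answer) could be:

```javascript
// One possible dateDiff(): inclusive whole-day count between two dates,
// computed on UTC day boundaries to avoid daylight-saving drift.
// A sketch only; adapt it to your own date formats and validation needs.
function dateDiff(dateFrom, dateTo) {
  var from = new Date(dateFrom);
  var to = new Date(dateTo);
  var utcFrom = Date.UTC(from.getFullYear(), from.getMonth(), from.getDate());
  var utcTo = Date.UTC(to.getFullYear(), to.getMonth(), to.getDate());
  var msPerDay = 86400000; // 24 * 60 * 60 * 1000
  return Math.round((utcTo - utcFrom) / msPerDay) + 1; // +1 makes it inclusive
}
```

Keeping the count inclusive matches the dateDiffDays(dateToPicker, dateFromPicker) + 1 formula used earlier in the form, so the subtraction of weekends and holidays stays consistent.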
You can store/retrieve the employee's employment date in a SharePoint list or a BCS-connected external list, query an HR system's data using a web service call in jQuery, or query an AD source using an LDAP query. Choosing which source to use depends entirely on your company's data infrastructure and what would be the most cost-effective to implement. The possibilities are endless when using the JSOM together with JavaScript and jQuery. (1) In our delivered solution for our North American client, we also added code to check the user's locale so that we can properly initialize date objects from ambiguous text inputs. We do this because American users tend to place months before days, while Canadians gravitate towards days then months. Checking the locale is important so that we capture the right dates and do the right calculations. We also have additional code to do field validation, and to deduct dates in a way that accounts for leap years and different time zones. I have left these code snippets out for brevity, but let me know in the comments if they are important to you and I can start a new post or add them here! In my next post, I will talk about presenting SharePoint forms in a way that allows users to print them cleanly to paper. I will also show how to add friendly "Print" links/buttons in SharePoint lists, and how to link directly to forms with custom messages added and the print prompt popped up automatically.
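To give a flavor of the locale handling mentioned in (1), a minimal sketch might branch on a day-first flag when splitting ambiguous numeric input (a hypothetical, heavily simplified helper; our production code does far more validation than this):

```javascript
// Hypothetical sketch of locale-aware parsing for ambiguous "a/b/yyyy" input:
// "04/05/2015" is April 5 for month-first (en-US) users but May 4 for
// day-first users. Simplified: no validation or two-digit-year handling.
function parseAmbiguousDate(text, dayFirst) {
  var parts = text.split("/").map(Number); // [first, second, year]
  var month = dayFirst ? parts[1] : parts[0];
  var day = dayFirst ? parts[0] : parts[1];
  return new Date(parts[2], month - 1, day); // JS months are zero-based
}
```

The same input string yields two different dates depending on the flag, which is exactly why the locale check matters before doing any day-count arithmetic.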