Is Microsoft Flow the replacement for SharePoint Workflows?

I recently had the privilege of co-presenting a session on Microsoft Flow & PowerApps at the Microsoft Beyond US Roadshow in Hartford. I am a huge fan of Microsoft Flow and have done several sessions showing how you can orchestrate data across Dropbox, OneDrive, SharePoint & Salesforce with clicks & not code. One of the attendees in my session asked a very common question that I thought would make a good short blog post: “Is Flow the replacement for SharePoint Workflows?”

Over the past few years I have built dozens of business applications leveraging the SharePoint platform to route requests through approval processes, provide metrics on request turnaround time, and automate non-value-added steps. These solutions would undoubtedly leverage the SharePoint workflow engine for sending e-mails, assigning tasks, etc. Since Microsoft introduced Workflow Manager with SharePoint 2013, there have not been any additional enhancements to the workflow engine. Compounded with the fact that SharePoint 2016 did not include an updated version of the workflow engine, it would make sense to assume that Flow is the replacement for SharePoint Workflows.

However, I would argue that Microsoft Flow is really positioned as the next generation of business process management applications rather than an outright replacement for SharePoint Workflows. From a feature parity perspective, not all of the SharePoint Workflow actions are available in Flow (though new ones seem to be added all the time). At the time of this writing it lacks basic string manipulation actions, copying items (also not available in 2013 workflows but present in 2010), content approval/publishing, check-in/check-out, and waiting for field changes in list items. There is also the caveat that in order to access on-premises data you would need to set up a Gateway to make it accessible to Microsoft’s cloud.

Microsoft Flow provides much more capability than SharePoint does, which might initially frighten some Enterprise customers. While there is a lot of value in being able to orchestrate data across both line-of-business and public clouds, there definitely needs to be some up-front planning to ensure that you do not jeopardize the integrity of your company’s data. For example, it is absolutely possible to develop a Flow that copies files from your OneDrive for Business to your personal Dropbox.

Finally, from a licensing perspective Microsoft Flow is a pay-by-the-drink kind of service (technically, pay by the Flow run). There’s a bit of math involved, but essentially you are allocated an allotment of Flow runs per user in your Office 365 tenant based on your plan. Be sure to check out Microsoft’s Flow Pricing page for up-to-the-minute guidance. With SharePoint Workflows, on the other hand, you essentially get as many workflow runs as your infrastructure can support.

So getting back to the original question – is Microsoft Flow the direct replacement for SharePoint Workflows? In my opinion – No. Microsoft Flow is the evolution of business process management allowing you to build elegant solutions which have the ability to orchestrate data across various line of business applications leveraging “clicks” and not code. Combined with PowerApps as your mobile/responsive front-end the barrier to creating enterprise applications has absolutely been lowered to where you no longer need a team of developers to create basic applications.

Hope this helps & happy Flow-ing.

 

 


Creating Google Charts with the SharePoint Search API

Recently I had a client ask me to build them some lightweight BI reports because their enterprise team was too consumed with other priorities. I started to think about pulling in a 3rd party JavaScript charting library since they mostly wanted to report against 1-2 columns of a list. However, one of the challenges I faced was that the list had close to 10,000 items, and there would be views containing more than 5,000 items. So instead of leveraging the lists endpoint of the REST API, I decided to get the data via the Search API, which would overcome that 5,000 item limit.

Since I can’t disclose client data, I mocked up the same solution using data I found on the CT State Data site. I created a new list from the CSV and let it go until I hit over 30,000 list items:

state-listitems

The list had a few interesting columns – Agency, Department, Job Title, Compensation Type, & Amount.

state-listdata

I thought it might be interesting to build a quick report for how my tax dollars were being spent by agency. Since I would be leveraging Search for pulling in the data, I first needed to map a crawled property to a managed property. I picked one of the out of the box managed properties and mapped it to OWS_Agency:

state-managedproperty

After a few hours the managed property became available as part of the Search index.

Next was the easy part – I added a Content Editor Web Part to a page and pointed it at an HTML file which would pull together the solution.

The complete details of the HTML file can be found below but I just want to highlight a couple of the important parts.

1. When I get data from the Search API, I specify that I want content types of type Item (custom content types would work as well) and then ask to pull back RefinableString102 as part of the refiners set so I can see how many items exist per state agency:
/_api/search/query?querytext='ContentType=Item'&refiners='RefinableString102'

The refiners work the exact same way in the Search REST API as they do when you perform an actual SharePoint search – it returns results that match the criteria you specify. One of the bonus features is that by asking to pull back refiners, you immediately get a summary of the different values for that refiner. In our case, I’m able to report on all the different state agencies by using the RefinableString102 refiner. Within that, it found 67 different agencies and then gave me the count of how many expenditures there are per agency.

In a production environment, you would probably want to also limit the query to only return results from that particular site collection or site, but the purpose of this blog post is to hopefully give you a very easy to follow along with example.
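To make this concrete, here’s a minimal sketch of that call using jQuery (a sketch, not the exact code from my HTML file – it assumes the page runs inside SharePoint so _spPageContextInfo is available, and drawChart is a hypothetical callback that builds the chart from the response):

// Query the Search REST API, asking for the agency refiner (RefinableString102).
// In production you might also add a Path restriction to the querytext to scope
// results to a particular site collection.
var searchUrl = _spPageContextInfo.webAbsoluteUrl +
    "/_api/search/query?querytext='ContentType=Item'&refiners='RefinableString102'";

$.ajax({
    url: searchUrl,
    type: "GET",
    headers: { "Accept": "application/json;odata=verbose" },
    success: function (xData) {
        // xData contains the refinement results - the structure is shown below
        drawChart(xData);
    },
    error: function (err) {
        console.log("Search query failed", err);
    }
});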

2. When I make the AJAX call, the JSON returned has a fairly deep hierarchy. I took a screenshot so you can see the structure:

state-xdata

xData.d.query.PrimaryQueryResult.RefinementResults.Refiners.results[0].Entries.results[0].RefinementCount gives me access to 3999, which is how many expenditures were issued to the Board of Regents agency.

RefinementName – the agency name from the data set
RefinementCount – the number of items tagged with that particular agency
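As a sketch, drilling into those entries in code looks roughly like this (inside the success callback from the earlier example; property names assume the verbose OData response format):

// Grab the refinement results for RefinableString102 (the first refiner requested).
var refiners = xData.d.query.PrimaryQueryResult.RefinementResults.Refiners.results;
var agencyEntries = refiners[0].Entries.results;

// Each entry carries the refiner value (agency name) and how many items matched it.
agencyEntries.forEach(function (entry) {
    console.log(entry.RefinementName + ": " + entry.RefinementCount);
});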

3. Because I’m creating the table from data returned from the Search API, I needed to initiate a new DataTable for Google Charts and then iterate over my JSON data to fill in the cells. I told it how many rows I would have (the length of the results) and then I used a for loop to fill in the table values.

4. I set the title of the chart to “State Budget” by creating an options object.

5. I then create a new variable called chart which instantiates the google.visualization.ScatterChart function, and pass it the div on the page where you want to render the chart.
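Pulled together, steps 3 through 5 look roughly like this (a sketch, assuming the Google Charts corechart package is already loaded, the agencyEntries array from above, and a div with the hypothetical id chart_div; my file uses a ScatterChart, but since the first column here is a category the sketch uses a ColumnChart – substitute whichever chart type you prefer):

// Step 3: initiate a new DataTable and fill in the cells from the refiner entries.
var data = new google.visualization.DataTable();
data.addColumn('string', 'Agency');
data.addColumn('number', 'Expenditures');
data.addRows(agencyEntries.length);

for (var i = 0; i < agencyEntries.length; i++) {
    data.setCell(i, 0, agencyEntries[i].RefinementName);
    data.setCell(i, 1, agencyEntries[i].RefinementCount);
}

// Step 4: set the chart title via an options object.
var options = { title: 'State Budget' };

// Step 5: instantiate the chart against a div on the page and draw it.
var chart = new google.visualization.ColumnChart(document.getElementById('chart_div'));
chart.draw(data, options);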

Here’s what the Google chart looks like with all the state data loaded:

state-googlechart

Full HTML file on Github

state-html-file.PNG

 

The New SharePoint Framework Meets Corporate IT

As a consultant there are times where I am slightly disconnected from the “Corporate World”, which is honestly part of the draw. For consultants, technology is the great enabler for providing clients with software & services to help them realize their business goals, so we tend to gravitate towards both trends & solutions that help us provide more value in a shorter amount of time. Demonstrating value is what helps us build relationships, which in turn often leads to repeat business – rinse, lather, repeat. However, in the Corporate World life is filled with words like standards, policies, approved lists, governance, compliance, legal, etc.

I have recently resumed working for one of my favorite clients at a large Insurance company in our area, which entails the usual badge, laptop, and parking pass in order to comply with the contractual agreement between my company and theirs. One of the interesting points to note is that in order for me to perform development work, I actually need to use their equipment to access their environment. The work is on a rather large SharePoint business application that I had previously built, which is a combination of code (all client side) and configuration with SharePoint 2013 workflows. During my previous engagement I had actually been given local administrative rights on my machine, which allowed me to load software like Visual Studio Code in order to perform my job.

Fast forward to this engagement, I’m without Administrative Rights on my machine and my two text editors of choice are Notepad or SharePoint Designer. I walk down to their Tech Express counter to see about having them install Visual Studio Code on my machine and they empathize but explain that their new policies prohibit the installation of any software that hasn’t already been packaged by their Engineering group. When I inquire about that process it basically entailed a multi-step review by Legal, Compliance, Software Engineering, etc. Typical turnaround time is a couple of weeks which likely isn’t helpful when the work needs to be done in the next week or so. I thank the Tech and proceed back to my desk where I submit a request for Administrative Rights on my machine thinking at least if I can get that, I’ll load the software myself and be done with the work hopefully before their security group catches on.

My request is routed to a governance approver who expresses concerns about both my status as a non-employee and my request itself (“setup developer laptop” is my vague request). We speak more about it and I learn that not only is the technician correct about the software process, but there is also extra attention placed on Open Source software. He explains that the EULA agreements can be confusing and sometimes publishers make distinctions between personal & corporate use which can have implications on who owns the intellectual property. We talk about the way development is going with open source technology such as NodeJS, Bower, Gulp, and installing Node packages to help streamline development, and I’m met with the realities of Corporate IT. The main priority for Corporate organizations is to protect the company – functions like Legal, Human Resources, and even Information Technology have the initial marching orders to protect the company. Even as development trends move towards open source technology, big corporations remain wary of it from a support & licensing perspective. From a security perspective they also do not like things such as local web servers and folks having persistent administrative rights to load the newest packages from Github onto their machines.

So where am I going with all this? For today’s SharePoint development there are really a few choices:

1) Configuration with mostly out of the box functionality and perhaps customization using SharePoint Designer. (workflows, maybe a little jQuery here and there)

2) Client side development where you’re hopefully installing some sort of text editor like VS Code, Brackets, etc to write a combination of HTML & JavaScript. If you are able, you might include things like NodeJS, Gulp, Bower, etc to automate portions of your development life cycle.

3) Visual Studio (heavy) development – either server side code, sandbox solutions, Add-in model, or maybe even client side development since it does support that as well. 🙂

What I see as an interesting problem, which will impact both SharePoint and web developers in these large organizations, is the trend towards NodeJS, Gulp, Yeoman, and whatever other Node modules/apps become popular. Loading these into the global scope can require administrative rights, which are often not given out. To make it more difficult, you’re not just asking to have Visual Studio Professional loaded on your machine. With just Visual Studio, enterprises can provide temporary admin rights or use policies to allow you to run Visual Studio as an Administrator. There’s also the very clear licensing agreement between Microsoft & “Large Corporation”.

But then there is the argument that we are talking about “web development” and not really SharePoint development, which I agree with. If you think about it, though, web development is kind of a grey area because from what I have seen, most large organizations treat their external websites much differently than their corporate Intranets. The Internet site represents the company and the brand, and serves both customers & potential candidates alike. Many times agencies are brought in to help from the design perspective and/or development resources are brought on to “build it”. In those situations the developers likely fall into the same category as me – the consultant types who bring our own equipment with no barrier to entry. SharePoint is on the corporate intranet, inside the firewall, maybe sometimes in O365. Microsoft touts the number of organizations in the Cloud, but most of my main clients are on-premises.

I’m not saying that I am against the new “SharePoint Framework”; what I’m saying is that I think there’s going to be a market to cater to (namely large enterprises) where perhaps there’s a less automated way – or better yet, a more contained set of tooling – for those developers. To scoff and say that large enterprises need to work towards embracing the way things are going is difficult as well. Going back to my earlier point, most of those departments (HR, Legal, IT, etc.) are there to protect the company.

I don’t have the perfect solution but I can tell you that I’m curious to see how this unfolds.

Yammering about the new O365 Community

For those of you who know me, I’m a little sarcastic with a pretty dry sense of humor. Friday night I was scanning my Twitter feed and noticed that Naomi Moneypenny had put out an article on the new O365 Community and how it was not on Yammer. I had seen the blog post out on the Office Blogs site and had tagged it as something I was going to peruse during my Saturday morning coffee & catch-up-on-news time. I’m a big fan of Naomi, she’s absolutely brilliant and her observations are incredibly insightful – so I decided to check her post out before continuing on with my Friday night plans of yard work & various chores. As I had suspected it was a really sharp article, and she made a really valid point about why it makes sense to not have the O365 Community on Yammer – it completely eliminates the barrier to entry. You can choose to browse it anonymously or register with your Microsoft account so that you can engage in the community. She also pointed out that enabling anonymous access means search engines are able to index the content as well, making it even more valuable for those looking for information about O365.

I have a love/hate relationship with Yammer and I’m slightly conservative when talking to clients about it. From my perspective it can be an incredibly useful tool within a Corporate environment where you have the appropriate Power User base in place to ensure that answers and/or ideas are being captured and somehow logged into a knowledge management tool, eventually helping break the cycle of everyone blasting out questions to an organization. I like it for the remote “in the field” workers who are capturing information and posting it for those at their desks to analyze and provide input where necessary. I’m not a huge fan of it for the large-scale “chaotic” implementations similar to what we have with the O365 Network & Groups. Honestly, what we have today is kind of a mess. There are groups for different things – Development, IT Pro, Client-side Development, Patterns & Practices, the list goes on and on. I’m not against having all those groups, but finding them involves a bit of guessing and/or wasted time searching. There’s also a lack of real ownership, which is kind of the Yammer model, but that doesn’t always work with large groups. There’s the occasional message about, “can’t wait to see where <insert person here> takes this group” – which undoubtedly leads to 1-2 posts by that person before they get busy with life. 🙂

Now here’s where my Friday night went oh-so wrong.. In Naomi’s post she provided me with the ultimate obnoxious question setup:

Screen Shot 2016-07-17 at 8.42.11 AM

Maybe it was the really chaotic week, or the cycle of the moon, or the fact that my sarcasm gets the best of me, but I just couldn’t help responding back on Twitter with the following:

Screen Shot 2016-07-17 at 8.45.39 AM

Sure, it seemed innocent at the time, and knowing Naomi I thought she would have gotten a chuckle out of it, but what I did not realize was that my quiet evening was going to turn into a chaotic mess of likes, retweets, responses, threads & subthreads. At one point I’m pretty sure Al Gore jumped in and let us know that he actually invented Yammer.

Looking back through the thread I identified the themes of the arguments for why this isn’t a good move as:

– Investments have been made already in the existing O365 Yammer Network
– This is yet another place to monitor questions/answers/information
– Not hosting this on Yammer means that Yammer is dead
– The “Community” didn’t like not having input on the decision

These are all very valid points and I have a couple of comments for them:

1) Investments have been made already in the existing O365 Yammer Network
From my perspective, the life of an IT professional is constant change & evolution despite investments made in products. I can recall a huge ERP implementation that had teams churning through millions of dollars over the course of a few months to implement a new system, only to have the project halted right before go-live. This was because of a pending merger – the company being merged with already had licenses for the product, and the decision pivoted to waiting until after the merger was completed (a 1-2 year delay). For this instance, yes, there was absolutely time & energy put into building out these Yammer groups, but to quote Naomi, the penetration level was less than 100,000 users on the network, which is an incredibly small fraction of the SharePoint & O365 users out there. I’m not trying to be too critical, but you could argue that the network was a failure based on the numbers alone. Obviously you can always argue no matter what side of the coin you’re on, but in terms of the percentage of install base versus engaged users, the Yammer network simply doesn’t have the reach that Microsoft would like to have. Don’t forget, Microsoft’s success is based on adoption & consumption of services – ensuring adoption & consumption is completely tied to their bottom line.

Just to build on that point, the O365 IT Pro group has only 13,329 members (as of 9:22 AM EST on 7/17/16), and that’s the second biggest group in the network!

Screen Shot 2016-07-17 at 9.21.58 AM

2) This is yet another place to monitor questions/answers/information
Yes. I absolutely cannot argue with the fact that this is yet another place to monitor for SharePoint/O365 questions, answers, and announcements. However, I’m curious what the impact is on those who will use this platform mostly for consuming information & announcements versus those community leaders who are publishing information. What I mean is – take user Joe, working at a large company, who just wants to learn more about Planner and how it impacts him. Now instead of hunting across Stack Exchange, MSDN forums, MVP blogs, etc., he can go to this one Community Site (choosing to log in or remain anonymous) and follow information put out by Microsoft or members of the community. This is a much different experience from that of someone like Rob Windsor, who is an amazing Community leader in the Development space.

Rob made the comment here:

Screen Shot 2016-07-17 at 9.08.01 AM

As someone on the content-producing side – knowledgeable and sought after – I can absolutely see how yet another community can lead to eye rolling and perhaps non-participation. From my perspective though, Rob is more the exception than the rule. I can empathize with the additional overhead this can add to his participation in the community, though I would ask whether at some point he might consider dropping one of the other channels in favor of this new one once it’s beyond the Preview period.

3) Not hosting this on Yammer means that Yammer is dead
So I have to chuckle, because the “Yammer is dead” comment is really what caused my Friday night to go from quiet to my phone constantly buzzing with Twitter updates. 🙂

Yammer is no different than any other solution in that it was developed to match a specific use case. I feel that Yammer works in Enterprise environments with governance and with people dedicated to ensuring that individuals get the most out of the platform. I personally did not like the O365 Network on Yammer; I found it to be a hot mess of groups all over the place.

Take a look at the splash screen – it’s so much easier to navigate towards the type of content you are looking for in the new network:

Screen Shot 2016-07-17 at 9.19.08 AM.png

The Yammer experience, by contrast, is a hodgepodge of scrolling through to find the groups that are most relevant. The new site is a nice, clean, modern interface for getting information about whatever you’re looking for – whether it be Yammer, SharePoint, Office Apps, etc.

Also to address the other elephant in the room about Yammer being “dead”, I can definitely see how folks might interpret that not using it means that Microsoft is pulling the plug. However, I think that the mass of O365 users would make it very difficult for Microsoft to just pull the rug out from users. What I think we might be seeing is a true convergence of SharePoint, OneDrive, Yammer, and Groups into something else. I don’t have any insider info on this one. But if you look at the power of Groups, the re-energizing of SharePoint, and the rich capabilities of Delve – it feels to me like something big might be coming which really ties them all together with Yammer perhaps somehow either being reborn or updated to help complete that picture. I can only see Yammer becoming more integrated vs being killed off.

4) The “Community” didn’t like not having input on the decision
So I consider myself to be part of the O365/SharePoint Community – I run a user group and I speak at SharePoint Saturdays when I can. It is true that nobody asked me if we should switch from the O365 Yammer Network to this new platform. When I think about the community, though, I gravitate towards the people who are consuming announcements, reading blog posts, asking questions & hopefully getting answers. Running the user group, I find myself striving to build an inclusive environment where people feel comfortable asking questions & getting help with their problems. Looking at the new network, I think this was Microsoft’s intent as well – they wanted to make it easier, organize the content, eliminate any additional headaches with Yammer accounts, and provide a nice modern experience. I would also say that the numbers in the Yammer network don’t accurately reflect the true volume of community members. If you remove the Microsoft employees, the MVPs, and the evangelists, you’re really talking about a very small number of network members versus true community members.

So are there people who are active in the O365 Yammer Network? Absolutely! Might they have concerns or mixed feelings about moving? Sure! Rather than look at it as a negative, I would challenge those who have some concerns to look forward to a hopefully more rewarding experience if Microsoft is able to grow the number of active users in this new community. I intend to announce it during our next CT SharePoint User’s Group meeting, and I would encourage those who either run or participate in local user groups to do the same.
To quote Dux Raymond Sy – “Shift happens”. 🙂

Changing a site’s master page using REST via SharePoint Designer Workflow

As a follow-on to my previous post about how to create subsites using the SharePoint REST API, I encountered a scenario where a customized master page was already in existence and I needed to apply that to subsites created with my provisioning workflow.

<insert quick note about Microsoft Patterns & Practices>
Microsoft’s Patterns & Practices are awesome as they provide you with prescriptive guidance for how to customize SharePoint without encountering future collisions with the product team’s development path. However, there may be times where you deviate from that guidance (such as not touching the master page) and require a solution for changing the master page.
</end PNP guidance>

So, let’s get down to business.. You’re going to want to build a few dictionary variables:

1. MetadataMasterPage

Name | Type | Value
type | String | SP.Web

2. RequestHeadersMasterPage

Name | Type | Value
accept | String | application/json;odata=verbose
content-type | String | application/json;odata=verbose
X-HTTP-Method | String | MERGE
IF-MATCH | String | *

3. JSONMasterPage

Name | Type | Value
__metadata | Dictionary | Variable:MetadataMasterPage
MasterUrl | String | /sites/yoursite/_catalogs/masterpage/oslo.master
CustomMasterUrl | String | /sites/yoursite/_catalogs/masterpage/oslo.master

*Now clearly you wouldn’t always point your MasterUrl & CustomMasterUrl to oslo.master; this is just an example. You would provide the actual path to the master page you are setting.

Another point – there are two master pages, the Site Master Page & the System Master Page. See the image below to figure out which one is which:

(Also note – you have to include both in your dictionary variable even if you’re just setting one to a different value)

masterpage

Now all you have to do is create an App Step and include a “Call HTTP web service” action within it.

The URL is going to be: http://pathtotheURL/_api/web and the HTTP method will be POST.

masterpage_web_service

Click the OK button, then right click the workflow action and select properties.

1. RequestHeaders = Variable:RequestHeadersMasterPage
2. RequestContent = JSONMasterPage
3. ResponseHeaders = new Dictionary; call it ResponseHeadersMasterPage

MasterPage_Web_Service_Details
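For reference, here’s a minimal sketch of the equivalent call made directly from JavaScript instead of from the workflow (a hypothetical example – the site URL and master page path are placeholders, and it assumes the code runs on a SharePoint page so jQuery and a request digest are available):

// Hypothetical sketch of the MERGE request the workflow is performing against /_api/web.
var body = {
    __metadata: { type: "SP.Web" },
    MasterUrl: "/sites/yoursite/_catalogs/masterpage/oslo.master",
    CustomMasterUrl: "/sites/yoursite/_catalogs/masterpage/oslo.master"
};

$.ajax({
    url: _spPageContextInfo.webAbsoluteUrl + "/_api/web",
    type: "POST",
    data: JSON.stringify(body),
    headers: {
        "Accept": "application/json;odata=verbose",
        "Content-Type": "application/json;odata=verbose",
        "X-HTTP-Method": "MERGE",
        "IF-MATCH": "*",
        "X-RequestDigest": $("#__REQUESTDIGEST").val()
    },
    success: function () { console.log("Master page updated"); },
    error: function (err) { console.log("Update failed", err); }
});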

That’s it! Go ahead and run your workflow and behold the power of the SharePoint REST API. It’s awesome! 🙂

Here’s what the full workflow looks like with my URL blurred out:

MasterPageWorkflow

Creating subsites using REST API from SharePoint Designer Workflow

I absolutely love SharePoint 2013 style workflows solely because of the Call Web Service action. It is hands-down one of the coolest features as it allows you to build some really amazing solutions.

I’m documenting this for my own benefit but if you happen to be in the same boat trying to build your own solution – I hope this helps!

Starting point: make sure that you have configured workflows to be able to run with elevated privileges in your site collection. Rather than re-invent the wheel, just follow this Microsoft article. From there, create a brand new Reusable workflow. The first step will be to create a string variable called RESTUri and set it by combining the workflow context’s Current Site URL with /_api/web/webinfos/add. This is going to be the URL that the Call HTTP Web Service action will access:

subsites1

Next you have a few Dictionaries to create:

1. RequestHeaders which will contain both Accept & Content-Type with the value of: application/json;odata=verbose

subsites2

2. Metadata which will contain just a single entry of type as string with the value: SP.WebInfoCreationInformation

subsites3

3. JSONRequest which will contain a bunch of different values:

Name | Type | Value
Url | String | The site name in the URL, e.g. http://sharepoint/sites/site1/subsite (reference the list column value)
Title | String | The site title (example: Jared’s Awesome Site)
Description | String | The site description (example: Jared’s Awesome Site is a world class SharePoint blog)
Language | Integer | 1033 (for English, others here)
WebTemplate | String | STS#0 (reference this blog for more choices)
UseUniquePermissions | String | true/false – if you want to inherit permissions, go with false
__metadata | Dictionary | Workflow Variable:Metadata (defined earlier – note the double underscore)

subsites5

4. Params which will contain a single entry called parameters as type dictionary with the value of the JSONRequest variable.

subsites4

Great! Now that you’ve set all that up, create your App Step and add a “Call Web Service” action inside of it. (If App Step is greyed out, please see here again)

You’re going to want to click on the word “this” and insert the variable RESTUri in the “Enter the HTTP web service URL” spot and then specify this as a POST method.

subsites6

Click OK, then click on the word “request” and select the Params variable you created. Your Call Web Service action should look like this now:

subsites7

Now, right click on the Call Web Service action and click the properties button. Map RequestHeaders to the variable you created, then create a new dictionary called ResponseContent and map it to the ResponseContent property.

subsites8

Then click OK.

I like to add a step after the web service call which logs the responseCode variable to the workflow history so I know if it was successful or not. When it’s not and I’m troubleshooting, I’ll create an e-mail task which then sends me the ResponseContent so I can get more information about the error that’s being thrown.
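For reference, here’s a minimal sketch of the same call made directly from JavaScript rather than from the workflow (hypothetical values throughout; it assumes the code runs on a SharePoint page so jQuery and a request digest are available – the workflow’s Params dictionary corresponds to the parameters wrapper shown here):

// Hypothetical sketch of the POST the workflow makes to create a subsite.
var body = {
    parameters: {
        __metadata: { type: "SP.WebInfoCreationInformation" },
        Url: "subsite",                 // relative URL segment for the new site
        Title: "Jared's Awesome Site",
        Description: "Created via the REST API",
        Language: 1033,
        WebTemplate: "STS#0",           // team site template
        UseUniquePermissions: false
    }
};

$.ajax({
    url: _spPageContextInfo.webAbsoluteUrl + "/_api/web/webinfos/add",
    type: "POST",
    data: JSON.stringify(body),
    headers: {
        "Accept": "application/json;odata=verbose",
        "Content-Type": "application/json;odata=verbose",
        "X-RequestDigest": $("#__REQUESTDIGEST").val()
    },
    success: function (data) { console.log("Subsite created", data); },
    error: function (err) { console.log("Subsite creation failed", err); }
});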

If you’re good, go publish your workflow and click OK at the prompt about the App Step.

This is what my workflow looks like at this point:

subsites10

Now go attach this new workflow to your list, which should have columns for Title, Description, and URL.

One other fun little tidbit, which I’ll give thanks to Fabian Williams for, is that the App Step can be a little tricky. If your workflow is throwing “Unauthorized” errors, check out Fabian’s blog post.

What’s in a namespace?

If you have attended any of my sessions on beginning client-side development you might have heard me talk about ensuring that you do not pollute the global namespace. For SharePoint/O365 developers who are starting to learn client-side development, the concept of namespaces is familiar from C#, but not as common in JavaScript. The whole purpose of developing under a namespace is to ensure that your variables do not collide with other variables, resulting in broken functionality, errors, etc.

The approach that I take to address this is by creating a unique object and then working under that object as a pseudo-namespace.

For example:

var jaredApp = jaredApp || {};

jaredApp.GetData = function () {
    console.log("This functionality exists under jaredApp");
};

You could follow that approach and build out your other functions – and then it is as easy as jaredApp.GetData(); to call the function. The benefit of this is that all of your variables fall under jaredApp (or whatever you decide to call it), rather than binding to the window object, which is where variables are assigned by default – this is also referred to as the global namespace.

There are a couple of other approaches, such as the immediately invoked function expression (IIFE), which looks something like this:

(function () {
    console.log("This functionality exists temporarily within this invoked function");
})();

The nice thing about an IIFE is that all variables created within it are scoped to that function, so they are not bound to the global namespace, which prevents collisions.

Happy SharePointing & JavaScripting!