Brian Smith's Microsoft Planner and Project Support Blog

Project and Project Server August 2016 Public Updates


Thanks Suzanne for preparing and posting the Project Server Public Update post again this month – which you can all find at Project Server 2013 and 2016 August 2016 PU.  No 2010 updates this month, so don’t go looking for those (and if you are looking, then hopefully you are also looking to upgrade!)

 



Project Server or Project Online: Please don’t use PROJ_UID as a TASK_UID

$
0
0

When creating projects using PWA or Project Professional, each task and assignment gets a fresh GUID – which is unique.  Usually if using CSOM then new GUIDs would also be generated for each new entity, although it is possible to specify a particular GUID to be used.  It is technically possible to use the same GUID that you have used for the plan’s PROJ_UID and set it as a task’s TASK_UID.  However, please don’t.  In Project Web App, when editing a plan (changing a custom field or description, for example), the code we run on the server checks that the GUIDs are all as they should be; it finds a GUID used for a type it is not expecting (as well as the type it is expecting, of course) and you will get an error: An error has occurred when saving your project to the server. Please contact your administrator for assistance.


 

And in the queue you will find a Failed and Blocking Correlation job for the Project Update from PSI, and the error message will look something like this:

 

Datasets:

  • ProjectDataSet
    • Table Project
      • Row: PROJ_UID='94a00658-4418-4851-9437-378fc226eb94'
        • Error ProjectNotFound (1000) – column


General

  • Queue:
    • GeneralQueueJobFailed (26000) – ProjectUpdate.ProjectUpdateMessage. Details: id='26000' name='GeneralQueueJobFailed' uid='fae360d7-1560-e611-80ee-001dd8b73698' JobUID='5ac936d4-1560-e611-80ee-001dd8b73698' ComputerName='6f80c7c5-1fd9-447c-aa6d-f60e3e3ab4ec' GroupType='ProjectUpdate' MessageType='ProjectUpdateMessage' MessageId='2' Stage='' CorrelationUID='87df989d-ca7c-e0fd-0000-01df4be1e070'. For more details, check the ULS logs on machine 6f80c7c5-1fd9-447c-aa6d-f60e3e3ab4ec for entries with JobUID 5ac936d4-1560-e611-80ee-001dd8b73698.

 

ProjectNotFound is the error returned from our code that is validating that everything GUID-wise is as it should be – and in this case it finds that the GUID used for the plan is also used for one of the tasks.

The best resolution (besides not reusing the same GUID across entities in any new plans) is to open the affected plan in Project Professional (the desktop client), delete this task and re-create it – which will give it a brand new, distinct GUID.  It is probably easier to create the new task first and then delete the bad one – just so you can see what values you need to set.
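And going forward, if you are creating tasks through CSOM, the safest pattern is to generate a fresh GUID for every new entity (or leave the Id out altogether and let the server generate one).  Just as a hedged sketch of that pattern – the DLL path, PWA URL and credential handling here are placeholders for your own environment:

# Minimal CSOM sketch - always give a new task its own fresh GUID
# Assumes the SharePoint/Project client components are installed - adjust the path as needed
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.ProjectServer.Client.dll"

$ctx = New-Object Microsoft.ProjectServer.Client.ProjectContext("https://contoso.sharepoint.com/sites/pwa")
# ...set $ctx.Credentials as appropriate for your environment...

$projects = $ctx.Projects
$ctx.Load($projects)
$ctx.ExecuteQuery()

$draft = $projects[0].CheckOut()
$taskInfo = New-Object Microsoft.ProjectServer.Client.TaskCreationInformation
$taskInfo.Id = [Guid]::NewGuid()    # never reuse the PROJ_UID (or any other existing GUID) here
$taskInfo.Name = "My new task"
$draft.Tasks.Add($taskInfo) | Out-Null
$draft.Publish($true) | Out-Null    # publish and check in
$ctx.ExecuteQuery()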

PPM User Voice and Team Assignments–we heard you!


A reminder about the User Voice site for Office 365 Project & Portfolio Management – and one of the recent feedback items that has moved to “Started” – https://microsoftproject.uservoice.com/forums/218133-office-365-project-portfolio-management-ppm/suggestions/11573715-allow-multiple-team-resources-to-apply-for-a-task.  Allow multiple team resources to apply for a task that has been assigned to a team resource (allow 1:n replacement for team resources).  I am pleased, like many of you, that there are some enhancements coming to this feature – it is certainly one feature where, ever since its release, customers have expected it to do far more than it really did: a case where each customer had written their own specification for what they wanted it to do and was disappointed with the actual implementation. Great to see this getting some attention – and a good example of how your suggestions and votes on https://microsoftproject.uservoice.com/forums/218133-office-365-project-portfolio-management-ppm can really make a difference.


Don’t forget, Microsoft Planner also has its User Voice page – https://planner.uservoice.com/forums/330525-microsoft-planner-feedback-forum if you have suggestions or just want to see what other ideas are out there and which ones we are thinking about or have started work on.

Project Online: How do I find who is using Project Online?


Chris Boyd recently responded to the User Voice thread around the “Last Connect” column no longer being shown in Project Online, with a suggestion on how one might replace this functionality using other features already available in Project Online and Office 365.  The thread is here – https://microsoftproject.uservoice.com/forums/218133-office-365-project-portfolio-management-ppm/suggestions/14773167-last-connect-column-no-longer-shown-in-project-o – but I’ll also publish it here, and I’m adding a few screenshots as I’ve already walked through these steps on my Project Online tenant.  Over to Chris first, but read on to the end as I have a few comments and thoughts to add.

 

First, I would like to provide a little more context on why the last connected date was removed from the admin user page in PWA. Unfortunately, keeping the last connected date current was an expensive operation that was impacting end users in unexpected ways. We had several service escalations which led us to remove this column. It was negatively impacting several experiences:

• Logging into the PWA site
• Moving between pages within PWA
• Project Center load time

At this point in time we do not have plans to add it back. We will continue to monitor the need for organizations to have this information. In the meantime, here is a workaround to capture who is accessing the PWA site:

1. Configure audit settings for a site collection for the PWA site: https://support.office.com/en-us/article/Configure-audit-settings-for-a-site-collection-a9920c97-38c0-44f2-8bcb-4cf1e2ae22d2?ui=en-US&rs=en-US&ad=US.  Make sure to check the “Searching site content” checkbox.


 

2. Add a “Search-Driven Content” web part to the PWA home page or any other PWA web part page to track the usage of it.

a. Click on the Gear in the top right corner
b. Edit page
c. Add a Web Part
d. Under Categories, select “Search-Driven Content”, then “Recommended Items”


If you don’t want the web part to be visible, continue with the following otherwise go to step 3.

e. On the “Recommended Items” web part, click the arrow in the top right of the web part
f. Edit Web Part
g. Expand Layout and check Hidden
h. Click Ok

3. Stop editing the page

At this point, anytime someone accesses the home page it will be logged. To generate a report:

1. Go to: https://MY_SPO_URL/PWA_URL/_layouts/15/Reporting.aspx?Category=Auditing


2. Run a custom report
3. Make sure to check “Searching site content”


The report that is generated should show you who has accessed the PWA site. It is important to note that it will not capture Project client usage or reporting via OData.

Chris Boyd
Principal PM Manager
Microsoft Project Team

The report is an Excel file which has two tabs – Audit Data – Table, and Report Data 1.

 


Obviously this will only register visits to the specific page – so you may need to add the web part to other pages if the home page is not normally used – and, as Chris mentions, this will not register usage from the OData feeds, or from clients such as Project Professional, Project Online Desktop Client or the iPad Portfolio App.  The Report Data 1 page gives the User Id and when an access occurred, so you could build up a Power BI dataset by importing the Excel data (the _data table when using Get Data from Excel), and likely just the User Id (suitably trimmed) and the Occurred (GMT) column – adjusted for your time zone if you wanted.  I haven’t thought through all the logic yet on automation options.  You’d need a Date table too.  If anyone’s already done something similar then feel free to share.

The other question I’m already hearing someone ask – even before posting this – is how do I know someone hasn’t logged in?  The best way to handle this would be to use PowerShell to get a list of your users who have a license for Project Online – and use this in the same Power BI model as the full list of potential users.  You could then link this to the Report Data 1 output and show the most recent date for each User ID, and choose the option to also show items with no values so that you could see who hadn’t logged in.  You would only be seeing data from now forwards – so no history – but this does give you the potential to get so much more information on usage patterns than just the unreliable ‘last connect date’ which would actually register a connection even if the user failed to log in.
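As a starting point for that licensed-user list, something along these lines should do it – a rough sketch assuming the MSOnline module is installed and your tenant prefix is ‘contoso’:

# List everyone holding any Project Online sku and export to CSV
# (the tenant prefix "contoso" is a placeholder - use your own)
Connect-MsolService
Get-MsolUser -All |
    Where-Object { $_.Licenses.AccountSkuId -like "contoso:PROJECT*" } |
    Select-Object DisplayName, UserPrincipalName |
    Export-Csv .\ProjectOnlineLicensedUsers.csv -NoTypeInformation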

Project and Project Server September 2016 Updates Released


I just posted the latest release information for on-premises Project and Project Server – both 2013 and 2016 – and even 1 fix for Project 2010!  Take a look over on the Project Support blog at https://blogs.technet.microsoft.com/projectsupport/2016/09/14/project-server-2010-2013-and-2016-september-2016-pu/ 

Some points to note for Project Server 2016 public update for September – posted as known issues on the KB:

  • If you install the September 2016 public update for SharePoint Server 2016 and then perform a license upgrade from Standard to Enterprise, the license upgrade will time out before completing. This causes the license upgrade operation to fail.
  • If you want or need to perform a SharePoint 2016 license upgrade, you must complete the license upgrade before installing the September 2016 public update.
  • This issue will be fixed in a future public update.
  • Patching and upgrading the September 2016 public update for SharePoint Server 2016 may take three times longer than in previous public updates. Please be patient while the patching and upgrading operations are completing.

So if you are planning on making a license change then either do it before patching or wait for the fix which will be in a future update.  On the last point – this update is SLOW.  It probably took about 6 or 7 hours on my non-production grade VM – hopefully you are running on better kit than my VM – but be prepared for a longer update than usual.

I’ve also updated the lists of fixes that I have on the Project Support blog – take a look at these for a quick way to see all the fixes for each version since we shipped – along with the version numbers of the client patches and the database versioning for the server products.

Project Online non-EXO notifications and October 2016 client updates


I often find that Friday is a great time to catch up on a few things – and often that means preparing and posting a blog when almost everyone else has finished for the weekend – or, if you are in Hawaii, American Samoa or French Polynesia, probably finished last Friday or before… (Apologies to anyone who does actually work in those places – and I know that I have many readers there who can’t all be on holiday and are sad enough to still read my blog!)

 


But on to the real topic – in case you missed it on Friday, I blogged about a feature that has just been turned on, allowing alerts and reminders to go to the defined ‘work e-mail’ of users that do not have an Exchange Online account.  See Project Online – Notifications even if you don’t have Exchange Online for the full details.

Tomorrow is the release day for Project Server updates (2nd Tuesday of the month) and this week the server bits will also be joined by the client bits – we had a late-breaking problem with the Project client fixes last week, so we re-fixed and will be releasing this month on the 2nd Tuesday rather than the usual 1st Tuesday.  I understand this also means they will only be available via download rather than via automatic updates.  Look out for the update mail tomorrow or Wednesday – depending how early in the day we see the release.

And a final point – Paul Mather followed up on his novel solution to the missing ‘Last Logon’ problem with part 1 of a blog post on how to report against his JavaScript-populated list – see https://pwmather.wordpress.com/2016/10/10/last-logon-time-for-the-projectonline-pwa-users-report-ppm-powerbi-powerquery-office365-sharepoint-bi-part-1/ for the details – and look out for part 2.

Project and Project Server October Updates released

Project and Project Server November 2016 Updates released


Microsoft Teams and Microsoft Planner–what to expect


With the release of Microsoft Teams there is a new experience with Microsoft Planner – plans are one of the items that can be added within a team – and you can add multiple plans!  There are a few differences between the way these ‘Team Plans’ behave – but they are real ‘Planner Plans’ and are stored in exactly the same service.  First a little background that will help understand why they behave as they do…

With Planner currently there is a one to one relationship between Plans and Groups.  Create a Plan and it creates a Group – Create a Group and it creates a Plan.  Same with Teams – create a Team and it creates a Group – which creates a Plan.  So how do the plans created within Teams fit in to all this?  I’ll walk you through a scenario…

I create a Team in my Office 365 tenant (after switching on Teams in Admin, Settings, Services and add-ins – then selecting Teams and enabling…) called BSPJO_Team:


Once I have done this I also see this as a Group in Outlook in Office 365:


And in Planner – this Plan is also available:


Now if I add a new Plan via my BSPJO_Team called My PJO Blog Team:


The first thing you will notice is that this doesn’t prompt for an e-mail address to associate with the team.  As this is within a Team it does not get its own e-mail and associated SharePoint site (so conversations and attachments are also not available directly in the plan – but of course at the Team level you have these capabilities anyway). 

Once I create this plan I’m going to add a task called My Blog Team Task and assign myself:


If I now go to Planner I do not see this new plan in my Planner hub:


But I do see the assigned task in My tasks:


I can work on this task in either environment – and after setting a category in Planner:


I can see the same category set in Teams:


This is all expected behaviour, and the piece that is a little confusing is that the My PJO Blog Team plan does not show in the Planner hub.  This is actually filtered out as the plan does not have the usual one to one relationship to Groups – but could be one of many under the BSPJO_Team Group.  This will change at some point as we move towards having a many to one relationship to Groups outside of the Microsoft Teams scenario.

One other condition I will make you aware of – the synchronization of differences between plans, tasks, etc. happens regularly to ensure changes other people make are visible to you – but there is a limit per user of 4 ‘diff sync’ processes – and the addition of Planner to Teams means that this is another ‘client’ that may be syncing.  So once we also release the other clients (iOS, Android and Windows) you may reach this limit if you also have a laptop browser or two open.  One of these will get an error (I am not sure of the exact text) indicating that you cannot be serviced at this time.

I hope this helps you understand the current behaviour of plans within Teams.

Project Online: How to handle the sku changes?


In August 2016 we announced some changes to the Project Online plans – https://support.office.com/en-us/article/Important-information-for-Project-Online-customers-about-plan-changes-496aafdd-9f62-4daf-8d2e-0e700925c6b2?ui=en-US&rs=en-US&ad=US – and now we are in 2017, more of you will be either starting on or renewing to these plans.  Swapping the old sku for the new sku can be a challenge – particularly as these are not necessarily a one to one mapping for your users.

First, however, let’s just recap what changed.

We retired Project for Office 365, Project Online and Project Online with Project for Office 365 (we had dropped the ‘Pro’ part of the client name earlier in the year).  Project Lite was renamed to Project Online Essentials.  The new plans were Project Online Essentials (the renamed Project Lite) as well as two new skus that both include the Project Online Desktop Client (the new name for Project for Office 365) – Project Online Professional and Project Online Premium.  The article mentioned above covers more details about these different plans and your options for renewing, but basically the only straightforward change is Project Lite to Project Online Essentials.  If you have purchased just Project for Office 365 then most likely you would switch to Project Online Professional – and if you have Project Online or Project Online with Project for Office 365 then you have a choice: if your users need access to the portfolio features, demand management or the new Resource Engagement features then they should move to Project Online Premium – otherwise Project Online Professional will be enough.  Easy?

The main focus of this blog post is to help guide you through this process – and it breaks down into 7 steps:

  1. What do you currently have in terms of users and skus?  We have some PowerShell to help!
  2. What are your options?  Read the article mentioned above and also the Service Description – which covers the capabilities of each sku in more detail.
  3. Identify which users need which new skus based on analysis of 1 and 2.
  4. Buy or renew to the new skus as required
  5. Assign the new sku/licenses to your users.  We have some PowerShell for that too!
  6. Ensure you don’t have any users on the old skus.
  7. Cancel any remaining old skus.

A quick note on terminology around skus and plans (and I don’t mean project plans – or for that matter Planner plans…).  A sku usually refers to the top level item that you purchase by subscription – like Office 365 Enterprise E3, or Project Online with Project for Office 365 – but it has a different internal name – like BriSmithPJO:ENTERPRISEPACK and BriSmithPJO:PROJECTONLINE_PLAN_2 for the two examples I have given (BriSmithPJO is my tenant name).  Then plans are items that come with the sku and are the level at which you tend to make license assignments.  For example, the sku BriSmithPJO:ENTERPRISEPACK has plans that represent Flow, PowerApps, Sway, Planner and many more – and BriSmithPJO:PROJECTONLINE_PLAN_2 has plans of SharePointWAC, SHAREPOINTENTERPRISE, PROJECT_CLIENT_SUBSCRIPTION and SHAREPOINT_PROJECT.  From the Project Online perspective these are the old skus and plans – but they happen to be the ones on my trial tenant.  For comparison with the new names this table should help:

 

List of possible old SKUs:

  • PROJECTONLINE_PLAN_1
  • PROJECTONLINE_PLAN_1_STUDENT
  • PROJECTONLINE_PLAN_1_FACULTY
  • PROJECTONLINE_PLAN_2
  • PROJECTONLINE_PLAN_2_STUDENT
  • PROJECTONLINE_PLAN_2_FACULTY
  • PROJECTCLIENT
  • PROJECTCLIENT_FACULTY
  • PROJECTCLIENT_STUDENT

List of possible new SKUs:

  • PROJECTPREMIUM
  • PROJECTPROFESSIONAL
  • PROJECTPREMIUM_STUDENT
  • PROJECTPROFESSIONAL_STUDENT
  • PROJECTESSENTIALS_STUDENT
  • PROJECTPREMIUM_FACULTY
  • PROJECTPROFESSIONAL_FACULTY
  • PROJECTESSENTIALS_FACULTY
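If you want to see which skus you have on your own tenant – and the plans underneath each one – a quick way (a small sketch of my own, using the same MSOnline module as the scripts below) is to walk the ServiceStatus collection that Get-MsolAccountSku returns:

# List each sku in the tenant, and the plans (with provisioning status) underneath it
Connect-MsolService
Get-MsolAccountSku | ForEach-Object {
    Write-Host $_.AccountSkuId
    $_.ServiceStatus | ForEach-Object {
        Write-Host ("    " + $_.ServicePlan.ServiceName + " : " + $_.ProvisioningStatus)
    }
}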

 

To help with steps 1 and 5 we have some PowerShell, and my colleague Matt Byrd has put together a great script that can report on current usage and output a cool CSV file that you can play around with in Excel to see which users have which licenses – then the same script can handle switching licenses or adding new skus to users.  For those who might like to understand how all this happens I’ll start with a couple of basic scripts to group users by sku and skus by user – then unleash Matt’s script.

All these scripts require you to connect to your tenant from PowerShell – so you will need to have installed the required modules – and this article https://support.office.com/en-us/article/Managing-Office-365-and-Exchange-Online-with-Windows-PowerShell-06a743bb-ceb6-49a9-a61d-db4ffdf54fa6 is a good one to review to make sure you have the right modules.

Some simple stuff first – and then I’ll introduce a script from the script gallery that will help you.  If you just want to see which users have which sku license – and which sku is licensed to which users – you could simply run the following PowerShell, which outputs to the screen (you could redirect to a file).  It will prompt you to log in to your tenant – and you will need to be an admin with access to the licensing options in the Office 365 admin portal.

#  Throw up a login window - log in to the tenant
Connect-MsolService

# Get users and see which sku(s) they have assigned

$Users = Get-MsolUser

ForEach($User in $Users){
     Write-Host $User.DisplayName " has the following Sku(s)"
     Write-Host $User.Licenses.AccountSkuId
}

# Get skus and see which users are assigned

$Skus = Get-MsolAccountSku

ForEach($Sku in $Skus){
     Write-Host $Sku.AccountSkuId
     $SkuUsers = Get-MsolUser | where {($_.Licenses.accountskuid -Like $Sku.AccountSkuId)}
     ForEach($SkuUser in $SkuUsers){
           Write-Host $SkuUser.DisplayName
     }
}

My colleague from the Exchange Online support team, Matt Byrd, came up with a very comprehensive script which you can get from the script gallery at https://gallery.technet.microsoft.com/Manage-your-O365-Licenses-b23ccd16


In Matt’s words, this script:

Helps automate common license management tasks by providing a simpler set of switches on existing PowerShell Cmdlets.

  • Add a new SKU.
  • Replace one SKU with another.
  • Provides a picker for choosing a SKU if not specified.
  • Disable plans within a SKU when adding or replacing.
  • When making changes only operates on a specified set of users.
  • Comprehensive report of assigned SKUs and enabled plans

You can use Get-Help on this command to get the usage and examples – and the first thing that will help with step 1 above is the -Report option.  The full command, including agreeing to the disclaimer in the script (you should read this first, of course) and setting an output log file, would look like this – and I also show the output I see from my test tenant (I was already connected via Connect-MsolService):

 

PS C:\PowerShell> .\Manage-MSOLLicense.ps1 -IAgreeToTheDisclaimer -Report -LogFile .\BlogReport.log
[1/20/2017 9:16:29 AM] – Created 01/10/2017
[1/20/2017 9:16:29 AM] – Agreed to Disclaimer
[1/20/2017 9:16:29 AM] – Reporting Mode
[1/20/2017 9:16:29 AM] – Generating Sku and Plan Report
[1/20/2017 9:16:30 AM] – Found 3 SKUs in the tenant
[1/20/2017 9:16:30 AM] – Found 17 Unique plans in the tenant
[1/20/2017 9:16:30 AM] – Getting all users in the tenant
[1/20/2017 9:16:30 AM] – Found 56 users in the tenant
[1/20/2017 9:16:31 AM] – Finished 25 Users
[1/20/2017 9:16:31 AM] – Finished 50 Users
[1/20/2017 9:16:31 AM] – Exporting to .\License_Report_20170120.csv
[1/20/2017 9:16:31 AM] – Report generation finished; exiting

The requested log file also contains the output shown above – but the really interesting piece is the csv file that is created.  Assuming you have Excel installed, you can just open the csv file and see the report.  Using the Sort & Filter, Filter option – and setting the column widths – you then have a neat report like this:


I’ve filtered my user list and shrunk some of my columns to make things fit – but you can see in Column C the sku that is assigned – and then columns D to T show the plans available under those skus and their status.  Success means it is licensed and provisioned – Pending Provisioning means it is licensed but probably hasn’t been used by that user yet – and Disabled means that specific plan is disabled for that user (more on disabled plans later…).

From there it is pretty easy to select just the users having specific skus, to see who needs their license changed (I just have the ‘old’ licenses in my test trial tenant – so not a great example).  Once you know who needs their licenses changed you can use Matt’s tool with the -NewSKU and -ExistingSKU parameters set to swap out the old SKU for the new one.  Again, Get-Help has some guidance here – but an example of using the script to achieve step 5 above might be:

C:\PS> $Users = Get-MsolUser | where { ($_.Title -eq "Staff") -and ($_.Licenses.accountskuid -notlike "*PROJECTPREMIUM")}

.\Manage-MSOLLicense.ps1 -IAgreeToTheDisclaimer -Users $Users -LogFile c:\temp\license.log -NewSKU company:PROJECTPREMIUM -ExistingSKU company:PROJECTONLINE_PLAN_2 -DisabledPlans "SWAY,FLOW_O365_P2"

This command reads all your users who have a specific Title (in this case “Staff”) and who do not already have the PROJECTPREMIUM sku.  It then uses this array of users (held in the $Users variable) and passes it through to the Manage-MSOLLicense.ps1 script.  The script agrees to the disclaimer, sets the log location and then sets the new SKU as PROJECTPREMIUM, which will replace the old SKU PROJECTONLINE_PLAN_2.  This example also shows the ability to use the -DisabledPlans parameter to disable SWAY and FLOW (just an example – I have nothing against either Sway or Flow – they are really cool!).

Of course your situation may not be this simple – there are many filter options you can use to select your users – and the best approach will really depend on your individual circumstances and the number of users you manage.

Once you have re-assigned licenses you could run the report again to ensure everything is as you expected – and use the fresh report to be sure no one is on the old skus before you cancel them.

A couple of gotchas I should point out.  First, there are situations where having the same plan under different skus can cause errors.  This was an issue with the old skus and the SHAREPOINT_PLAN_2 plan – which can be selected either under ENTERPRISEPACK or PROJECTONLINE_PLAN_2, but not both.  If you tried to use this script to set such a condition it would error out unless you used the -DisabledPlans option.  This should not be an issue when assigning the new skus.

The 2nd gotcha relates to the use of the -DisabledPlans option, and if you read my blog regularly (who wouldn’t!) then you will have seen me talk about this before in regard to Planner.  The -DisabledPlans option will disable just the plans mentioned – but could then re-enable plans you hadn’t considered – or, if there was a new plan you were unaware of and hadn’t included in -DisabledPlans, it would get enabled. We certainly want to encourage you to use all the features of Office 365, but I do appreciate that from an administrative side you may sometimes want to limit licenses while you evaluate new features.  I’m keen that you don’t get any surprises.  For more details on this topic see my Planner blog – https://blogs.msdn.microsoft.com/brismith/2016/06/24/microsoft-plannerdisabling-planner-license-without-enabling-other-licenses/.  For an interesting approach to Office 365 license management it is also worth reading Matt McNabb’s blog series – part 4 deals with disabled plans – https://mattmcnabb.github.io/Office-365-Licensing_4.
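One way to avoid that surprise – a small sketch of my own, not part of Matt’s script – is to read a user’s currently disabled plans first, and then pass the complete set back in via -DisabledPlans:

# Capture the plans already disabled for a user so nothing gets silently re-enabled
# (the UPN and sku name are placeholders - substitute your own)
$user = Get-MsolUser -UserPrincipalName "someone@contoso.onmicrosoft.com"
$license = $user.Licenses | Where-Object { $_.AccountSkuId -eq "contoso:ENTERPRISEPACK" }
$alreadyDisabled = $license.ServiceStatus |
    Where-Object { $_.ProvisioningStatus -eq "Disabled" } |
    ForEach-Object { $_.ServicePlan.ServiceName }
$alreadyDisabled -join ","    # feed this (plus any new plans) into -DisabledPlans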

I know this is a complex area but hopefully with the script from Matt Byrd and the CSV file you have all the information you need to decide how to move forward with the new skus and how to re-assign the licenses.  Support is always happy to help too – just get your tenant admin to open an incident if you need some assistance.  Or ask a question on the blog.

Microsoft Planner: How to clone a Plan with Graph


(This is the zip file of the PowerShell script – plannercloneblog)

One common request we already have in the roadmap for Microsoft Planner is support for templates – but this is still a few months out.  I wanted to find my way around Graph and what I could do with plans and tasks – so I thought cloning an existing plan might be a good thing to show.  This isn’t production ready code – really just a step-by-step using PowerShell to read and write the various entities in Planner.  This is all based on the beta Graph endpoints for Planner – https://graph.microsoft.io/en-us/docs/api-reference/beta/beta-overview – and when this goes to General Availability – hopefully this quarter – I will make the necessary edits.  There may be some slight changes to the endpoints.  Thanks to one of our MVPs – Jakob Gottlieb Svendsen – jgs@coretech.dk – http://blog.coretech.dk/jgs – as I borrowed some of his code from his example Graph scripts published at https://www.powershellgallery.com/packages/MicrosoftGraphAPI/0.1.3/Content/MicrosoftGraphAPI.psm1 to get the authentication tokens.

The first part of the walk-through shows creating a simple Plan – then I’ll move on to the cloning.  Follow along with the documentation linked above – so that the endpoints and requests make more sense.

For any application to talk to Graph it will need some permissions – and these are controlled by creating an AppId in Azure and setting the required access levels – then this AppId is passed in when requesting the authentication token.  There are a couple of ways of doing this – one through the Admin portal of Office 365 and then Admin Centers and Azure AD – but the one I will walk through is directly in the Azure portal (https://portal.azure.com).  Either way you should be able to follow the steps.

In the Azure Portal select Azure Active Directory, then App Registrations and you should end up with something like this (you may or may not see existing App Registrations):


 

I’m going to click Add – then enter my details and click Create:


This just takes a few seconds and then I can see my AppId:


Clicking on the BlogAppId takes me to the details and I can then set myself as the owner, and add the Required Permissions.  While I’m on that page I can also copy the Application ID as I will need that in my PowerShell script:


I’ll skip the screenshots for adding me as owner – that is pretty straightforward – and go to Required Permissions.  One permission is already set – sign in and read user profile.  You need to select the additional permissions to read and write all groups and to read and write directory data – then click Save.


Once these permissions are saved you can Grant them – using the Grant Permissions option in the header:


That’s all we need to do on the Azure side of the house – now we can get on to the more interesting stuff and open the Microsoft Azure Active Directory Module for PowerShell ISE.  I already have the AAD module loaded and the MSOL stuff.  I’ll walk through the script and explain some of it as I step through and show results – but the full script is attached too.  There is no error detection, and it will probably fail if you run it multiple times as I don’t initialize everything – it is just meant to help you find your way around Planner using Graph.

The first thing I do, after making sure that Jakob’s Get-GraphAuthToken function is in scope, is set some variables and get my token:

# Blog Client ID - my Application ID from Azure
$clientId = '50d344ab-fd8a-4cbe-93a7-29cdb8949a71'

# myId - you can pull this from Graph
$myId = "cf091cb1-dc23-4e12-8f30-b26085eab810"

$tenant = "brismithpjo.onmicrosoft.com"

$token = Get-GraphAuthToken -AADTenant $tenant -ClientId $clientId -RedirectUri "http://brismithpjo.sharepoint.com" -Credential (Get-Credential)

This pops up a login – so I log in to my Contoso demonstration tenant.

To create a new Plan, first I need to create a Group, then add myself as a member of that Group, and then I can create the Plan with the Group as the owner of the Plan.  For the group creation I will make a POST call to https://graph.microsoft.com/beta/groups with a request containing the required properties in json format, and a header containing the authorization (with the access token from the earlier call) as well as the content type and content length.

# Create a Group

$Request = @"
{
  "description": "BlogGroup",
  "displayName": "BlogGroup",
  "groupTypes": [
    "Unified"
  ],
  "mailEnabled": true,
  "mailNickname": "BlogGroup",
  "securityEnabled": false
}
"@

$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('Content-Type', "application/json")
$headers.Add('Content-length', + $Request.Length)

$group = Invoke-WebRequest -Uri "https://graph.microsoft.com/beta/groups" -Method Post -Body $Request -Headers $headers

 

I’m returning my $group object – and this contains stuff I need when adding myself as a member and also creating the plan.  By selecting $group in the ISE and executing I see the following:

PS C:\> $group
StatusCode        : 201
StatusDescription : Created
Content           : {"@odata.context":"https://graph.microsoft.com/beta/$metadata#groups/$entity","id":"a496
8242-6b41-4afa-a93c-bd0a49beda86","classification":null,"createdDateTime":"2017-02-17T23
:20:28Z","description":"…
RawContent        : HTTP/1.1 201 Created
Transfer-Encoding: chunked
request-id: 0ca2fc60-c744-4adc-9a09-be35b5a5ef3b
client-request-id: 0ca2fc60-c744-4adc-9a09-be35b5a5ef3b
x-ms-ags-diagnostic: {"ServerInfo":{"DataCe…
Forms             : {}
Headers           : {[Transfer-Encoding, chunked], [request-id, 0ca2fc60-c744-4adc-9a09-be35b5a5ef3b],
[client-request-id, 0ca2fc60-c744-4adc-9a09-be35b5a5ef3b], [x-ms-ags-diagnostic,
{"ServerInfo":{"DataCenter":"West Central
US","Slice":"SliceB","ScaleUnit":"002","Host":"AGSFE_IN_1","ADSiteName":"WCU"}}]…}
Images            : {}
InputFields       : {}
Links             : {}
ParsedHtml        : mshtml.HTMLDocumentClass
RawContentLength  : 702

The status code should be checked in production code to ensure the right response was received.  As I am working in PowerShell I found it easier to handle PowerShell objects than json – so the following code enabled me to get the returned content into something more manageable.  I use this technique for most of the calls where I want to use the content.  In this case I’m creating my PowerShell object $groupContent and then getting a couple of the properties for later use – the id and also the displayName.

$groupContent = $group.Content | ConvertFrom-Json

$groupId = $groupContent.id
$groupDisplayName = $groupContent.displayName

To add myself as a member of the group I will be using $myId (hard coded at the top of the script) in the request – and then the $groupId variable is part of the $uri endpoint for the call to https://graph.microsoft.com/beta/groups/{id}/members/$ref – note the need to ‘escape’ the $ref with the ` character.  Again, this is a POST.

$Request = @"
{
  "@odata.id": "https://graph.microsoft.com/beta/directoryObjects/$myId"
}
"@

$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('Content-Type', "application/json")
$headers.Add('Content-length', + $Request.Length)

$uri = "https://graph.microsoft.com/beta/groups/" + $groupId + "/members/`$ref"

Invoke-WebRequest -Uri $uri -Method Post -Body $Request -Headers $headers

Once I have the Group and am a member I can create my new Plan.  The $groupId is the owner of the Plan – and I am using the same name for the Plan as the Group.  We will be supporting multiple Plans per Group at some point – in the way it is already implemented in Teams – but for now this is 1:1.  Nothing much different in this call to https://graph.microsoft.com/beta/plans – again a POST with the request and header set as you can see.  I’m pulling the Content of the returned object into a PowerShell object again – and pulling out the $planId as I will need that when I add my buckets.  One thing to note here is that if you do this too quickly after adding yourself as a member of the group you may get a 403 rather than the desired 201 as the status code – which indicates that it doesn’t yet know that you are a member.

$Request = @"
{
  "owner": "$groupId",
  "title": "$groupDisplayName"
}
"@

$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('Content-Type', "application/json")
$headers.Add('Content-length', + $Request.Length)
$plan = Invoke-WebRequest -Uri "https://graph.microsoft.com/beta/plans" -Method Post -Body $Request -Headers $headers
$planContent = $plan.Content | ConvertFrom-Json
$planId = $planContent.id
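If you would rather not just guess at a fixed delay for that propagation, one option – my own hedge, not part of the walkthrough script – is to retry the plan creation until it succeeds:

# Retry the POST a few times - a 403 here usually just means the new group
# membership has not propagated yet
$attempt = 0
do {
    try {
        $plan = Invoke-WebRequest -Uri "https://graph.microsoft.com/beta/plans" -Method Post -Body $Request -Headers $headers
        break
    } catch {
        $attempt++
        Start-Sleep -Seconds 10
    }
} while ($attempt -lt 6)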

Now we have a Plan (always good to have a Plan) so we can add a bucket.  Nothing new here – apart from the different endpoint – and you can see I used the $planId in the request.  The orderHint is a string that Planner uses to position things in lists.

$Request = @"
{
  "name": "BlogBucket",
  "planId": "$planId",
  "orderHint": "BlogBucket"
}
"@

$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('Content-Type', "application/json")
$headers.Add('Content-length', + $Request.Length)
$bucket = Invoke-WebRequest -Uri "https://graph.microsoft.com/beta/buckets" -Method Post -Body $Request -Headers $headers
$bucketContent = $bucket.Content | ConvertFrom-Json
$bucketId = $bucketContent.id

Next I can add a task to the bucket – using the $bucketId from the previous call, and I am also setting the assignedTo to $myId – so that I am assigned to the task.  I get the Task ID in case I want to do other things with the task – but for now this is all I’m going to do with this Plan.

$Request = @"
{
  "assignedTo": "$myId",
  "planId": "$planId",
  "bucketId": "$bucketId",
  "title": "Blog Task",
  "orderHint": "Blog Task"
}
"@

$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('Content-Type', "application/json")
$headers.Add('Content-length', + $Request.Length)
$task = Invoke-WebRequest -Uri "https://graph.microsoft.com/beta/tasks" -Method Post -Body $Request -Headers $headers
$taskContent = $task.Content | ConvertFrom-Json
$taskId = $taskContent.id

Going in to Planner I can see my new Plan, with its bucket and task – assigned to me.  So far so good!


Next we can try a clone.  For this I created a Plan called ‘Template’ and set buckets, tasks, assignments, descriptions, checklist and categories.  The aim is to create a new plan that has all these same values set.  In this case I’m not picking up any dates – but in the real world you could potentially choose a start date and use the date relationships in your ‘template’ to drive the new dates.  Here is my template:


My first piece of PowerShell reads through all my Plans and finds the one called “Template”.  This uses a GET and has no request body.  I iterate through my collection of Plans by getting the returned $plans.Content into a PowerShell object as before – the collection is then the .Value property of the Content – and I just compare each $plan.title to find what I’m looking for.  If you have many plans you might need to consider that the returned $plans would be paged – I’m ignoring that here as I know I don’t have that many – and there is probably a better way to find your template anyway.

$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('Content-Type', "application/json")
$plans = Invoke-WebRequest -Uri "https://graph.microsoft.com/beta/me/plans" -Method Get -Headers $headers
$plansContent = $plans.Content | ConvertFrom-Json
$planValue = $plansContent.Value
ForEach($plan in $planValue){
    If($plan.title -eq "Template"){
        $templateId = $plan.id
        $groupId = $plan.owner
        Break
    }
}
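For completeness, handling that paging would look something like this – a sketch that just follows the @odata.nextLink property until it runs out:

# Follow @odata.nextLink paging if you have more plans than fit in one response
$uri = "https://graph.microsoft.com/beta/me/plans"
$allPlans = @()
do {
    $page = (Invoke-WebRequest -Uri $uri -Method Get -Headers $headers).Content | ConvertFrom-Json
    $allPlans += $page.value
    $uri = $page.'@odata.nextLink'
} while ($uri)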

I have my $templateId so I can read my plan, and also the $groupId, which I will need for my members.  The next piece of code is a big chunk – basically reading out all the bits of my plan I am interested in.  These all use GET requests, and anything that I put in a Value variable is a collection.  In some cases I show some of the values – just so you can see what is going on.  For task details I am creating an array so I can keep track of the details, like checklists, inside each task in my collection.  Again, if you were coding in something other than PowerShell (or just know more than I do) then just keeping the json might be easier.  I did also pull the checklists into arrays – but found later that just using the json when re-creating was the easier way.  Finally I get the members from the group – as I want to add the same members to my new group.

#################################################
# Read Template
# Get buckets
#################################################

$uri = "https://graph.microsoft.com/beta/plans/" + $templateId + "/buckets"

$buckets = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers
$bucketsContent = $buckets.Content | ConvertFrom-Json
$bucketsValue = $bucketsContent.value

#################################################
# Get tasks
#################################################

$uri = "https://graph.microsoft.com/beta/plans/" + $templateId + "/tasks"

$tasks = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers
$tasksContent = $tasks.Content | ConvertFrom-Json
$tasksValue = $tasksContent.value

# Just looking at one of the tasks to see what came back
$tasksValue[6].appliedCategories | Get-Member
$tasksValue[6].bucketId

#################################################
# Get task details
#################################################

# Start with an empty array (Clear-Variable takes a name, not a typed $variable)
$taskDetailsContent = @()

ForEach($task in $tasksValue){
    $uri = "https://graph.microsoft.com/beta/tasks/" + $task.id + "/details"

    $taskDetails = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers
    [array]$taskDetailsContent += $taskDetails.Content | ConvertFrom-Json
}

$taskDetailsContent[6].checklist

#################################################
# Just for reference - not using the arrays returned
#################################################

ForEach($clist in ($taskDetailsContent[6].checklist | Get-Member -MemberType NoteProperty)){
    [array]$checklistNames += $clist
}
ForEach($itemName in $checklistNames){
    [array]$checklistItems += $taskDetailsContent[6].checklist.($itemName.Name.ToString())
}

#$taskDetailsContent[6].checklist | Get-Member -MemberType NoteProperty

#################################################
# Get plan details
#################################################

$uri = "https://graph.microsoft.com/beta/plans/" + $templateId + "/details"

$planDetails = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers
$planDetailsContent = $planDetails.Content | ConvertFrom-Json

#################################################
# Get Group Members
#################################################

$uri = "https://graph.microsoft.com/beta/groups/" + $groupId + "/members"

$members = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers
$membersContent = $members.Content | ConvertFrom-Json
$membersValue = $membersContent.value

So that is the reading part done – next is the writing – and this starts off as before – create a Group, add the members (which must include me), wait for a bit… and then create the Plan.  Once the Plan exists I add the plan details – basically the categories – which are the coloured fly-outs.  This uses a PATCH call and there is a new element in the header too – $headers.Add('If-Match', $planContent.'@odata.etag') – so it knows what I am updating.  From there it is just a bunch of loops going through and adding the buckets, adding the tasks in the buckets and the details in the tasks – such as the applied categories, the description and checklist items.  For the checklist items I swapped out the GUIDs for new ones – but this isn’t imperative.  Just habit – and also I was proud I’d found a way to swap out the GUIDs using REGEX and wasn’t about to leave it out after all that effort!  As you are running the code yourselves you can look at the objects to see what they contain – this blog will get a bit long if I try to show every detail.  Scroll down to the bottom to see how this all ended up.

#################################################
#Create our clone
#################################################
# First create the Group and add all members
#################################################

$Request = @"
{
  "description": "BlogClone",
  "displayName": "Blog Clone",
  "groupTypes": [
    "Unified"
  ],
  "mailEnabled": true,
  "mailNickname": "BlogClone",
  "securityEnabled": false
}
"@

$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('Content-Type', "application/json")
$headers.Add('Content-length', + $Request.Length)

$group = Invoke-WebRequest -Uri "https://graph.microsoft.com/beta/groups" -Method Post -Body $Request -Headers $headers
$groupContent = $group.Content | ConvertFrom-Json

$groupId = $groupContent.id
$groupDisplayName = $groupContent.displayName

#################################################
# Adding members
#################################################

ForEach($member in $membersValue){

    $newId = $member.id

    $Request = @"
{
  "@odata.id": "https://graph.microsoft.com/beta/directoryObjects/$newId"
}
"@

    $headers = @{}
    $headers.Add('Authorization','Bearer ' + $Token.AccessToken)
    $headers.Add('Content-Type', "application/json")
    $headers.Add('Content-length', + $Request.Length)

    $uri = "https://graph.microsoft.com/beta/groups/" + $groupId + "/members/`$ref"

    $result = Invoke-WebRequest -Uri $uri -Method Post -Body $Request -Headers $headers
}

# The member addition takes some time to be available to Graph - might get a 403
Start-Sleep -s 30

#################################################
# Create the new plan
#################################################

$Request = @"
{
  "owner": "$groupId",
  "title": "$groupDisplayName"
}
"@

$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('Content-Type', "application/json")
$headers.Add('Content-length', + $Request.Length)
$plan = Invoke-WebRequest -Uri "https://graph.microsoft.com/beta/plans" -Method Post -Body $Request -Headers $headers
$planContent = $plan.Content | ConvertFrom-Json
$planId = $planContent.id

#################################################
# Add the plan details - the categories
#################################################

$cat0 = $planDetailsContent.category0Description
$cat1 = $planDetailsContent.category1Description
$cat2 = $planDetailsContent.category2Description
$cat3 = $planDetailsContent.category3Description
$cat4 = $planDetailsContent.category4Description
$cat5 = $planDetailsContent.category5Description
$Request = @"
{
  "sharedWith": {
  },
  "category0Description": "$cat0",
  "category1Description": "$cat1",
  "category2Description": "$cat2",
  "category3Description": "$cat3",
  "category4Description": "$cat4",
  "category5Description": "$cat5"
}
"@

$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('If-Match', $planContent.'@odata.etag')
$headers.Add('Content-Type', "application/json")
$headers.Add('Content-length', + $Request.Length)

$uri = "https://graph.microsoft.com/beta/plans/" + $planId + "/details"

Invoke-WebRequest -Uri $uri -Method PATCH -Body $Request -Headers $headers

#################################################
# Iterate through the buckets – creating each
#################################################

ForEach($newBucket in $bucketsValue){
    $newBucketName = $newBucket.name
    $newBucketOrderHint = $newBucket.orderHint

    $Request = @"
{
  "name": "$newBucketName",
  "planId": "$planId",
  "orderHint": "$newBucketOrderHint"
}
"@

    $headers = @{}
    $headers.Add('Authorization','Bearer ' + $Token.AccessToken)
    $headers.Add('Content-Type', "application/json")
    $headers.Add('Content-length', + $Request.Length)
    $bucket = Invoke-WebRequest -Uri "https://graph.microsoft.com/beta/buckets" -Method Post -Body $Request -Headers $headers
    $bucketContent = $bucket.Content | ConvertFrom-Json
    $bucketId = $bucketContent.id

    $newBucket.id
    $newBucket.name

    Start-Sleep -s 3

    ForEach($newTask in $tasksValue){

        # Checking if the task is in this bucket
        If($newTask.bucketId -eq $newBucket.id){

            $newTaskAssignedTo = $newTask.assignedTo
            $newTaskTitle = $newTask.title
            $newTaskOrderHint = $newTask.orderHint
            $newTaskPreviewType = $newTask.previewType
            If(!$newTaskAssignedTo){
                $Request = @"
{
  "planId": "$planId",
  "bucketId": "$bucketId",
  "title": "$newTaskTitle",
  "orderHint": "$newTaskOrderHint"
}
"@
            } else {
                $Request = @"
{
  "assignedTo": "$newTaskAssignedTo",
  "planId": "$planId",
  "bucketId": "$bucketId",
  "title": "$newTaskTitle",
  "orderHint": "$newTaskOrderHint"
}
"@
            }

            $headers = @{}
            $headers.Add('Authorization','Bearer ' + $Token.AccessToken)
            $headers.Add('Content-Type', "application/json")
            $headers.Add('Content-length', + $Request.Length)
            $task = Invoke-WebRequest -Uri "https://graph.microsoft.com/beta/tasks" -Method Post -Body $Request -Headers $headers
            $taskContent = $task.Content | ConvertFrom-Json
            $taskId = $taskContent.id

            Start-Sleep -s 3

            #################################################
            # Set Applied Categories for the tasks
            #################################################

            $taskAppliedCategories = $newTask.appliedCategories | ConvertTo-Json

            $Request = @"
{
  "appliedCategories": $taskAppliedCategories
}
"@

            $headers = @{}
            $headers.Add('Authorization','Bearer ' + $Token.AccessToken)
            $headers.Add('If-Match', $planContent.'@odata.etag')
            $headers.Add('Content-Type', "application/json")
            $headers.Add('Content-length', + $Request.Length)

            $uri = "https://graph.microsoft.com/beta/tasks/" + $taskId

            Invoke-WebRequest -Uri $uri -Method PATCH -Body $Request -Headers $headers

            Start-Sleep -s 3

            #################################################
            # Set the description and checklist for the task - if present
            #################################################
            # Getting the index of the task - to find the right items

            $ndx = [array]::IndexOf($taskDetailsContent.id,$newTask.id)

            If($taskDetailsContent[$ndx].description){

                $taskDescription = $taskDetailsContent[$ndx].description

                $Request = @"
{
  "description": "$taskDescription"
}
"@

                $headers = @{}
                $headers.Add('Authorization','Bearer ' + $Token.AccessToken)
                $headers.Add('If-Match', $planContent.'@odata.etag')
                $headers.Add('Content-Type', "application/json")
                $headers.Add('Content-length', + $Request.Length)

                $uri = "https://graph.microsoft.com/beta/tasks/" + $taskId + "/details"

                Invoke-WebRequest -Uri $uri -Method PATCH -Body $Request -Headers $headers
            }

            $taskChecklist = $taskDetailsContent[$ndx].checklist | ConvertTo-Json

            If($taskChecklist.Length -gt 6){

                # Swap out the checklist item GUIDs for new ones (also need to remove the read-only properties...)
                $clNew = new-object system.text.stringBuilder

                $pattern = "`{`"[a-fA-F0-9]{8}-([a-fA-F0-9]{4}-){3}[a-fA-F0-9]{12}"

                $lastStart = 0
                $null = ([regex]::matches($taskChecklist, $pattern) | %{
                    $clNew.Append($taskChecklist.Substring($lastStart, $_.Index - $lastStart))
                    $guid = [system.guid]::newguid()
                    $clNew.Append("{`"" + $guid)
                    $lastStart = $_.Index + $_.Length
                })
                $clNew.Append($taskChecklist.Substring($lastStart))

                $taskChecklist = $clNew.ToString()

                # Remove the read-only fields from the checklist json
                $clNew = new-object system.text.stringBuilder

                $pattern = "`"lastModifiedBy"

                $lastStart = 0
                $null = ([regex]::matches($taskChecklist, $pattern) | %{
                    $clNew.Append($taskChecklist.Substring($lastStart, $_.Index - $lastStart - 52))
                    $lastStart = $_.Index + $_.Length + 145
                })
                $clNew.Append($taskChecklist.Substring($lastStart))

                $taskChecklist = $clNew.ToString()
                $Request = @"
{
  "checklist": $taskChecklist
}
"@

                $headers = @{}
                $headers.Add('Authorization','Bearer ' + $Token.AccessToken)
                $headers.Add('If-Match', $planContent.'@odata.etag')
                $headers.Add('Content-Type', "application/json")
                $headers.Add('Content-length', + $Request.Length)

                $uri = "https://graph.microsoft.com/beta/tasks/" + $taskId + "/details"

                Invoke-WebRequest -Uri $uri -Method PATCH -Body $Request -Headers $headers

                # Start-Sleep -s 2
            }
        }
    }
}

My finished clone – I haven’t set the same items to ‘show on card’, but you can see from the following screenshot that all the details are there.  I am seeing some issues with the ordering of tasks – I think we have a bug there – but this is a great way to use templates until we have in-product support!


Enjoy!  And if you have any questions, just let me know.  The following gif shows the diff sync in Planner reflecting the updates as the cloning script runs:

BlogClone

Project and Project Server February 2017 Updates Released


Earlier this week saw the delayed release of the February 2017 Public Update (PU) for Project Server 2013 and 2016. Client updates were released February 7th as planned, and server updates on February 21st.  The usual schedule is client updates on the first Tuesday of the month and server updates on the second Tuesday – although this month the server release was delayed – and the ‘roll-up’ package for Project Server 2013 was not released.  Normal service and the full set of patches should resume in March (7th for client – 14th for server).

I’m dropping the 2010 updates as none are released this month – if we do have any going forward then I will mention those separately.

We are now delivering as Public Updates, although server fixes ship just via the Download Center and not via Microsoft Update (unless there is a security element or a fix deemed essential – this month one of the SharePoint Server 2016 fixes is a security fix).  These are still all cumulative and include fixes released in all previous updates since the last baseline (initial release for 2016 and SP1 for 2013).

Feel free to open a support case if you have any questions around this or need assistance getting these patches deployed.

We should be back to ‘normal’ install times now (I patched both my 2013 and 2016 systems in a couple of hours) – but I’m leaving this comment here just in case…  One point to note: the installation of the Project Server 2016 package (SharePoint Server) for September 2016 and beyond can take longer than previous 2016 updates – on my slow server it took several hours – so you should test installation in an environment similar to production to ensure you allow enough downtime.

The 2013 PU releases also have a real prerequisite of the appropriate Service Pack 1 (SP1), and links for SP1 are given below.  SP1 is enforced in this release, so you will find out (as I did) whether you really do have SP1 for all your installed components and language packs!  This also means RTM is no longer supported!  See http://blogs.technet.com/b/stefan_gossner/archive/2015/04/15/common-issue-april-2015-fixes-for-sharepoint-2013-cannot-be-installed-on-sharepoint-2013-sp1-slipstream-builds.aspx too, which describes an issue you might see if you don’t have the ‘right’ SP1.  Slipstream would work with the original SP1 – but the updates require the re-released SP1.  Since the May PU this shouldn’t be an issue – but I’m including it here just in case.

Another important point to add here: there was an issue in early 2013 with running the SharePoint Configuration Wizard on a server with Project Server 2013 installed – this was fixed by applying the April 2013 CU or later – so a good practice would be to load SP1, then the current PU, and then run the configuration wizard (if you didn’t already load the April 2013 through June 2014 CU).

Also worth noting that mainstream support for Project and Project Server 2010 ended October 13th 2015 – see https://support.microsoft.com/en-us/lifecycle

Also a reminder – an SP1 patched 2010 system (with no SP2) is no longer supported – see the Lifecycle site for more information – http://support.microsoft.com/lifecycle/search?sort=PN&alpha=project+2010&Filter=FilterNO

Project 2016

An overview of all the Office 2016 releases for February 2017 can be found here – https://support.microsoft.com/en-us/help/4010765/february-2017-update-for-microsoft-office – February, 2017, update for Microsoft Office

Project Server 2016

With the 2016 release we just have a single patch (though this month the single patch comes in two parts – a wssloc part and an sts2016 part), as we also have the single msi installation of SharePoint Server 2016 (Project Server still needs licensing separately, though). Both parts need installing before the config wizard is executed.  The sts2016 part of the patch also contains security fixes, so it is released via Microsoft Update and the Update Catalog as well as the Download Center.

February 14, 2017, update for SharePoint Server 2016 (KB3141515) – Includes Project fixes, like the roll-up patch in Project Server 2013.

https://support.microsoft.com/en-us/help/3141515/february-14-2017-update-for-sharepoint-server-2016-kb3141515

February 21, 2017, update for SharePoint Server 2016 (KB3141517) – can include Project fixes, like the roll-up patch in Project Server 2013 – but doesn’t this month.

https://support.microsoft.com/en-us/help/3141517/february-21-2017-update-for-sharepoint-server-2016-kb3141517

There is no database schema update this month so it remains at 16.0.4483.1000 – or will get updated to this if you hadn’t patched in January.  Remember, Project Server 2016 data is in the content database.  The version number 16.0.4498.1002 can be used to control the connecting client to the February 2017 level.  For reference – the RTM build number seen for the DB schema would be 16.0.4327.1000.

Project 2016 Client Package:

February 7, 2017, update for Project 2016 (KB3141514)

https://support.microsoft.com/en-us/help/3141514/february-7-2017-update-for-project-2016-kb3141514

The version of Project Professional 2016 will be updated to 16.0.4498.1002 in the properties for WinProj.exe.  In 2016 we don’t do a good job of displaying the version in File, Account, About Project – we only display the MSO version and not the specific Project version (you can confirm the Project version by looking at winproj.exe – by default for 32 bit in C:\Program Files (x86)\Microsoft Office\Office16).
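
If you just want to check this quickly, a one-liner such as the following will read the file version – a minimal sketch, using the default 32 bit install path (adjust the path to suit your installation):

# Read the file version of winproj.exe - default 32 bit path shown
(Get-Item "C:\Program Files (x86)\Microsoft Office\Office16\WINPROJ.EXE").VersionInfo.FileVersion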

If you have Click to Run and are using Project Pro for Office 365 at the ‘16’ level, then your version will depend on which update frequency you have set. Take a look at https://blogs.office.com/2016/02/09/deferred-channel-build-now-available-for-the-office-365-client-apps/ for a few changes in this area – Current Branch for Business is now called Deferred Channel. We are aware that we don’t appear to expose the full change details for Project and are looking into it – you should start seeing more here soon.

Project and Project Server 2013

An overview of all the Office 2013 releases for February 2017 can be found here – https://support.microsoft.com/en-us/help/4010765/february-2017-update-for-microsoft-office – February, 2017, update for Microsoft Office

This includes a number of fixes, so Microsoft strongly recommends that you test this in a test environment based on your production environment before putting this fix live in production.

The article below provides information on how to deploy the Project Server Cumulative Update.

You can read about the fixes included in the Project and Project Server February PUs from the following articles:

Project Server 2013 Server Rollup Package

No rollup package this month – see https://support.microsoft.com/en-us/help/4010765/february-2017-update-for-microsoft-office for any other SharePoint Server 2013 patches you may wish to load along with the Project Server 2013 individual package

Project Server 2013 Individual Project Package – (cumulative, but only the Project Server fixes):

February 21, 2017, update for Project Server 2013 (KB3141525)

https://support.microsoft.com/en-us/help/3141525/february-21-2017-update-for-project-server-2013-kb3141525

There is a database schema update this month – and the dbo.Versions table should show 15.0.4903.1000 after applying the February 2017 PU.  The version number 15.0.4903.1000 can be used to control the connecting client to the February 2017 PU level.
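
If you want to confirm the schema level after patching, a query like the following works – a minimal sketch assuming the Invoke-Sqlcmd cmdlet is available, and using placeholder server and database names:

# Check the Project Web App database schema version (placeholder names - adjust to suit)
Invoke-Sqlcmd -ServerInstance "SQLSERVER" -Database "ProjectWebApp" -Query "SELECT Version FROM dbo.Versions"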

SP1 for Project Server 2013 can be found here – http://support.microsoft.com/kb/2880553

Project 2013 Client Package:

February 7, 2017, update for Project 2013 (KB3141499)

https://support.microsoft.com/en-us/help/3141499/february-7-2017-update-for-project-2013-kb3141499

The client version number will be 15.0.4903.1000.  The server scheduling engine is no longer blocked by version control since the November 2014 CU on the server, so providing you have November 2014 CU or above on the server you can use the 15.0.4903.1000 value to control connection of the February 2017 PU patched client.

If you are running a server CU earlier than November 2014 CU, then follow the suggested version number for the server patch level you are running.  See Project Server 2013- Controlling the version of connecting clients–and PWA edits- for more details.  As mentioned above – the version number entered no longer controls the server side scheduling engine – so from the November 2014 CU release onward you can set a higher version to control clients without blocking the server side scheduling in the schedule web part.

SP1 for Project Professional 2013 can be found here – http://support.microsoft.com/kb/2817433

Also note that Click to Run installations will be automatically patched.  Installations in Enterprise Environments that have been modified will be deployed based on the schedule determined by your Administrator.  See http://support2.microsoft.com/gp/office-2013-click-to-run.  You may also choose to update your click to run Project Pro for Office 365 to the new 2016 level – this can still connect to Project Server 2013 – see the 2016 section above for current version.

Also a note for users of the Project client connecting to Project Online – see https://blogs.technet.microsoft.com/projectsupport/2016/12/15/using-project-online-time-to-be-sure-you-upgrade-the-client-software/ – as you will need a ‘2016’ level client to connect starting in March 2017.

Client Installation:

The instructions for installing the client patch are below.

NOTE: Microsoft strongly recommends testing within a NON-Production environment prior to rollout.

  1. Download the hotfix from the link in the KB Article.
  2. Extract the patch package by running the .exe file that you downloaded.
  3. Run the extracted .exe file to apply the patch to your Project Professional/Standard client.

Or, from February 2015 onwards use Windows Update to download and install the patches.

Microsoft Planner: Now you can recover deleted Plans and Groups

$
0
0

A big announcement yesterday and this is something our customers have really been waiting for – the ability to recover deleted Groups, and with them any other group related content that was deleted along with the group.  The support article that outlines this can be found at https://support.office.com/en-us/article/Restore-a-deleted-Office-365-Group-b7c66b59-657a-4e1a-8aa0-8163b1f4eb54?ui=en-US&rs=en-US&ad=US and you also need to be running the AzureAD PowerShell v2 Preview module to get this working.  That can be installed following the instructions at https://docs.microsoft.com/en-us/powershell/azuread/.  I think there is also a PowerShell version requirement – but it works fine on my Windows 10 laptop.

The reason our customers (and I work with Planner – so am looking at ‘our customers’ being the ones using Planner) usually lose Plans is that someone (IT admin?) sees the Group and does not recognize it and so deletes it.  The warning message when you delete it probably could be better – but along with the Group you also lose the Plan and SharePoint site that holds some of the other Group/Plan content.  I’ll walk through the scenario – right from installing the Azure module, deleting a Group – see what goes away – and then recovering.

First the Azure module.  If you open PowerShell as an admin (I usually use the ISE) then, following the 2nd link above, you will find the V2 Preview Release (2.0.0.98 as I write this) and you will need to run:

Install-Module –Name AzureADPreview

This might prompt you to install the NuGet provider – if it does then just follow those instructions first.  It prompted me – so I replied Yes.

image

The next prompt will likely be telling you that you are installing from an untrusted repository – I decided that I did trust the repository and responded Yes at the following prompt.

image

If you have installed a previous version of the preview – it may tell you that you need to use the –Force option – in my case on one of my machines I had used an earlier preview – but not on this particular machine – so I was good to go.

To run any Azure commands you need to log in to Azure – so the first command is Connect-AzureAD – which will throw up a prompt and you can log in to your Office 365 tenant.

image

OK, now to the fun part.  I have a Plan – that has a Group and a SharePoint site.

imageimageimage

And from the Group view I can select Edit Group – then Delete Group (You might also delete via PowerShell or the AD Portal).

image

I check the box – and the Group is gone.  So too the Plan.  You might see ‘Can’t get the Group data’ just after deletion if you browse to the plan – then finally once it has gone you get an ‘Oops, something went wrong…’ message.

image

The SharePoint site takes a little longer from what I have seen – and eventually you’ll get a 403 if you try and navigate directly to it.  Same with the mailbox too – in fact I stopped waiting for that to go away just so I could get on with this blog (I assume it would eventually be gone).

So now I have a deleted Group – or rather what we term a ‘soft deleted’ Group – as I can still recover it.

If I follow the support document listed at the top of this blog I can look for the deleted Group – and recover it, by using

Get-AzureADMSDeletedGroup – to get the Id – and then

Restore-AzureADMSDeletedDirectoryObject –Id <objectId> to recover the information

image
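
Putting those two commands together, a minimal sketch looks like this – the Id below is a placeholder, use the one returned for your deleted Group:

# List the soft deleted Groups to find the Id we need
Get-AzureADMSDeletedGroup | Format-Table Id, DisplayName

# Restore the Group - placeholder Id shown
Restore-AzureADMSDeletedDirectoryObject -Id "00000000-0000-0000-0000-000000000000"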

Once the script runs it does take a little while to recover everything – best to give it a while to get everything straight – as I have seen some odd but expected behavior if you try and go in before everything is restored.  For example if the SharePoint recovery isn’t complete you won’t see the attachments – and the checklists seemed to also not come back as quickly.  If you do see any behavior like this then generally a fresh browser or clearing the cache in your browser will get everything working (once all the data is returned).  I did see when preparing this blog and using the same browser session that I didn’t see my attachments and also my new bucket was not present and all tasks were in the To Do bucket.  Going to an InPrivate/Incognito window for the browser showed me the expected fully recovered plan.

Obviously I could just use my previous screenshots to show that the recovery worked Smile – but I didn’t – here is a view of the Notebook – freshly recovered:

image

And the answer to that question is a resounding YES!

We do also mention that although restore generally takes a few minutes it can at times take as long as 24 hours.  Get-AzureADGroup –ObjectId <objectId> will confirm if the Group is back.

The recovery is only available for 30 days from the time of deletion – and also this is only suitable for complete recovery of a deleted Group/Plan – we don’t have any capabilities to recover a deleted task or bucket from your plan – and no Point In Time Recovery options for Groups/Plans.

For more details of other AzureAD changes that came along with these new commands please see the PowerShell Gallery link above – and also the recent article from Rob de Jong and Curtis Love – https://docs.microsoft.com/en-us/azure/active-directory/active-directory-accessmanagement-groups-settings-cmdlets.  I’ll be refreshing some of my earlier blog posts on Planner around the new commands for Group creation control.

A different kind of post–thanks Kate!

$
0
0

I don’t usually post non-technical stuff here – but today one of my long-time colleagues has her last day at Microsoft.  After 22 years Kate Simpson has decided to retire and we will certainly miss her, and I know many of our customers will too!  Kate spent all those years supporting mostly Project, but in her earlier years, DOS, Multiplan, Excel, WfW…  Here is Kate’s bookshelf – a history of Project – or at least as long as we had boxed product. 

 

image

I’ve been fortunate to have shared over half that time working with Kate and her help and assistance has been invaluable to all of us on the team – all over the world.  Kate was recounting some ‘field trips’ yesterday – and an onsite to Maui got the top spot – but I know she also enjoyed her time in India training our team in Bangalore – some who are still there – and many others who went on to other roles in the Project world, for Microsoft and beyond.

170330_Kate_012 (2)

Kate and Mike plan to finish off the National Parks that they haven’t already visited (22 to go I think) and I’m sure she will soon wonder how she ever found the time to go to work!  Thanks Kate!

Planner: Group control–new PowerShell commands

$
0
0

Since posting my blogs that covered control of Group creation using PowerShell as a means of setting who can and cannot create Plans (which create Groups) there has been a new release of the Azure AD PowerShell module which supersedes the ‘v1 Preview’ that contained the earlier commands – such as Get-MSOLAllSettingTemplate and New-MsolSettings.  The new modules are still ‘Preview’ but in v2, the new module details can be found at https://docs.microsoft.com/en-us/azure/active-directory/active-directory-accessmanagement-groups-settings-cmdlets, and the v2 Preview module can be installed using the following command from within PowerShell:

install-module -Name AzureADPreview -RequiredVersion 2.0.0.85

I’m guessing this will get updated – I only show that version as that is the 3/17/2017 release that you need – and that command will ensure that any earlier versions you may have installed are updated.  As with all Azure AD commands – the first thing you will need to do is connect and log in:

Connect-AzureAD

This will pop up a login dialog – just use your Office 365 credentials (I’m assuming you are an admin) and you should see your Account, Environment and Tenant details returned.

The logic for controlling group creation is pretty much the same – and builds on the previous commands such that the new commands will read the previous settings.  For example I can use the new commands to read current settings to see what I have set in my test tenant.  Get-AzureADDirectorySetting replaces Get-MsolAllSettings.  The old commands will still work if you have the v1 Preview module installed – but you can no longer download the v1 Preview module:

Get-AzureADDirectorySetting -All $true | Format-Table Id, DisplayName

This returns a formatted table just showing the Id and name:

Id                                                              DisplayName 
—                                                                ———– 
78589c63-72cd-47d2-a187-86092a5f16e7   Group.Unified

To enumerate all the settings values we can use the new command Get-AzureADDirectorySetting with the –All parameter set to true then loop through the objects (settings):

Get-AzureADDirectorySetting -All $True | where-object {$_.DisplayName -eq “Group.Unified”} | ForEach-Object Values

This returns my current settings:

Name                                                  Value                              
—-                                                      —–                              
ClassificationDescriptions                                       
DefaultClassification                                            
PrefixSuffixNamingRequirement                                    
AllowGuestsToBeGroupOwner               False                              
AllowGuestsToAccessGroups                 True                               
GuestUsageGuidelinesUrl                                          
GroupCreationAllowedGroupId              7edd1d0b-557d-43e6-b583-4f3e0198c167
AllowToAddGuests                                True                               
UsageGuidelinesUrl                                               
ClassificationList                                               
EnableGroupCreation                            False

If you are watching closely you will notice there are a few more settings now compared to v1, and I have highlighted the new ones.  I’ll concentrate on the lower ones in this post as the other ones don’t really affect Planner (yet) but soon they will – and I will post again!

My configuration is to only allow my users to create Groups if they are in the Group with Id of 7edd1d0b-557d-43e6-b583-4f3e0198c167.  I can use the following command to get that group:

Get-AzureADGroup -ObjectId 7edd1d0b-557d-43e6-b583-4f3e0198c167

ObjectId                                                        DisplayName              Description                  
——–                                                          ———–                   ———–                  
7edd1d0b-557d-43e6-b583-4f3e0198c167      CanCreateGroups       Users allowed to create groups

Everything is set as it was before when I used the old Msol commands – but if I was starting from scratch what would I do?  I can start by removing my settings and walk through the steps to get them back:

$SettingId = Get-AzureADDirectorySetting -All $True | where-object {$_.DisplayName -eq “Group.Unified”}

Remove-AzureADDirectorySetting -Id $SettingId.Id

The steps to create a new set of settings are to read the settings template for unified groups, then set the settings and finally to save as a new set of settings.  All set?

$template = Get-AzureADDirectorySettingTemplate | where-object {$_.DisplayName -eq “Group.Unified”}

If you take a look at the $Template object while you have it in PowerShell then the $Template.Values | fl gives a nice list of the names and descriptions of the settings if you are interested in trying out some of the others.  For now I’m just going to set the ones I’m interested in.  I’m also going to hard code the GroupId – but at the foot of this blog I’ll include a couple of options to populate a variable with specific groups:

$settings = $template.CreateDirectorySetting()

$settings[“GroupCreationAllowedGroupId”] = ‘7edd1d0b-557d-43e6-b583-4f3e0198c167’

$settings[“AllowToAddGuests”] = “true”

$settings[“UsageGuidelinesUrl”] = “http://aka.ms/o365g”

$settings[“ClassificationList”] = “Low,Medium,High”

$settings[“EnableGroupCreation”] = “false”

New-AzureADDirectorySetting -DirectorySetting $settings

And then I can confirm these are set using the same command as above Get-AzureADDirectorySetting -All $True | where-object {$_.DisplayName -eq “Group.Unified”} | ForEach-Object Values and I see these settings – some of the ones I didn’t set take their default values.

Name                                          Value                              
—-                                              —–                              
ClassificationDescriptions                                       
DefaultClassification                                            
PrefixSuffixNamingRequirement                                    
AllowGuestsToBeGroupOwner     False                              
AllowGuestsToAccessGroups       True                               
GuestUsageGuidelinesUrl                                          
GroupCreationAllowedGroupId   7edd1d0b-557d-43e6-b583-4f3e0198c167
AllowToAddGuests                      True                               
UsageGuidelinesUrl                     Http://aka.ms/o365g                
ClassificationList                          Low,Medium,High                    
EnableGroupCreation                  False

As before you can re-open the settings object to update the values – or it is sometimes easier to remove and re-create as I have here.  I haven’t checked in the v2 Preview – but in v1, if you just removed the settings the previous behavior still held true – you needed to set EnableGroupCreation to True rather than simply remove the settings.
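
If you do prefer to update in place, a minimal sketch would look like the following – reading the existing settings object, changing a value and writing it back with Set-AzureADDirectorySetting:

# Read the current Group.Unified settings object
$setting = Get-AzureADDirectorySetting -All $True | where-object {$_.DisplayName -eq "Group.Unified"}

# Change a value and write the object back
$setting["EnableGroupCreation"] = "True"
Set-AzureADDirectorySetting -Id $setting.Id -DirectorySetting $setting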

Thanks to Rob de Jong and Rob Whaley for their guidance and input on using these new commands, and particularly the 2nd Rob for these commands to set a variable to use as the ‘allowed’ group – where $GlobalAdminsObjectID can be used in place of my hard-coded group.

# If we want to control who can create groups we can do the following:

# We can use this for canned Azure Roles:

$GlobalAdmins = Get-AzureADDirectoryRole | ? { $_.DisplayName -like “Company Administrator”}

$GlobalAdminsObjectID = $GlobalAdmins.ObjectId.ToString()

# Or we can create a security group and set its object id as a variable:

New-AzureADGroup -Description “Security Group for users allowed to create Office 365 Groups” -DisplayName “Office 365 Group Creators” -MailEnabled $false -SecurityEnabled $true -MailNickName “O365GC”

$GlobalAdminsObjectID = (Get-AzureADGroup -SearchString “Office 365 Group Creators”).ObjectId.ToString()

# Or we can call an existing security group and set its object id as a variable:

$GlobalAdminsObjectID = “b39e2044-a139-4463-8c9a-4578e43676ca”


Planner: Cloning a Plan with multiple assignments

$
0
0

I updated my previous posting with a rough re-write of my cloning PowerShell – but have since completed a full re-write that handles the objects in PowerShell much better (thanks for the feedback and guidance Tarkan!).

Here is a zipped up version of the ps1 – plannerclonemultiassignv3

I’m starting from a similar ‘template’ as before and my plan looks like this:

image

Some of the tasks have dates set for start and due dates – some haven’t.  In my code if I find a date I’m adding 7 days to it for the clone just to show how you could move dates – for a real template scenario you’d probably pass in a parameter.  The PowerShell ps1 is attached – and I’ll also walk through some of the code pointing out some of the more interesting bits (at least to me!)

I won’t go over the initial part again where I get the token and use the AppId – so if you are fresh to this topic you might want to read the previous blog post first – https://blogs.msdn.microsoft.com/brismith/2017/02/17/microsoft-planner-how-to-clone-a-plan-with-graph/.  The main change in the Graph for this update is the move to multi-assign – so now assignedTo has changed to an assignments collection of plannerAssignments – see https://developer.microsoft.com/en-us/graph/docs/api-reference/beta/resources/plannertask for the documentation on this.  Another big change is the endpoints now all have ‘planner’ added before the entity – so for example to get a task you now go to GET /planner/tasks/<id> – where id is the task Id.

The changes in the flow of my code are that I now create all the buckets, then all the tasks and then add all the task details – rather than doing the other entities within each bucket in one loop.  I’m keeping track of the old/new Ids for buckets and tasks in hash tables, and I’ve renamed my variables to hopefully make it more obvious what I’m doing.  I’m also manipulating the returned objects more in PowerShell rather than building up new objects – so removing read-only fields or ones I don’t want to set, then setting where necessary, like new Ids for buckets, and also making sure all checklists are unchecked.  I’m also updating a few fields – like the dates as mentioned above – and also setting the orderHints as appropriate (basically appending ‘<existingHint>P!’ to the current one should do the trick, although I’m still not seeing the order quite as I expect…).  I’ll try and get a better answer on that one.  As you will notice – not much has changed with error and exception handling…

So, on with the code.  Not all of the code is listed here – but the ps1 is attached.  I’ll jump in at the point where I have authenticated, grabbed a token and already read the template details and created the Group, added the members and created the Plan – so first thing is to set the categories I’m using on the plan (this might be a useful stand-alone thing just to stamp all your plans with a consistent set of categories!)

$newCategoryDescriptions = $templatePlanDetailsContent.categoryDescriptions | ConvertTo-Json

# Do a GET so I have the current etag
$headers = @{}
$headers.Add(‘Authorization’,’Bearer ‘ + $Token.AccessToken)

$uri = “https://graph.microsoft.com/beta/planner/plans/” + $newPlanId + “/details”
$result = Invoke-WebRequest -Uri $uri -Method GET -Headers $headers
$newPlanDetailsContent = $result.Content | ConvertFrom-Json
$Request = @”
{
“categoryDescriptions”: $newCategoryDescriptions
}
“@

$headers = @{}
$headers.Add(‘Authorization’,’Bearer ‘ + $Token.AccessToken)
$headers.Add(‘If-Match’, $newPlanDetailsContent.’@odata.etag’)
$headers.Add(‘Content-Type’, “application/json”)
$headers.Add(‘Content-length’, + $Request.Length)
$headers.Add(‘Prefer’, “return=representation”)

$uri = “https://graph.microsoft.com/beta/planner/plans/” + $newPlanId + “/details”

$result = Invoke-WebRequest -Uri $uri -Method PATCH -Body $Request -Headers $headers

The first line is just taking my categoryDescriptions property and converting to json – which is just what I need in my $Request.  I then get the current etag that I’ll need for my header – and that’s about it.  As you can see the $uri goes to “https://graph.microsoft.com/beta/planner/plans/” + $newPlanId + “/details” with the new ‘planner’ identifier.

Next I walk through my buckets, creating a hash table so I can use this as a lookup to see which tasks should land in which buckets later.

$bucketHashTable = @{}

ForEach($templateBucket in $templateBucketsValue){

$templateBucketId = $templateBucket.Id

$templateBucket.PSObject.Members.Remove(“Id”)
$templateBucket.PSObject.Members.Remove(“@odata.etag”)
$templateBucket.orderHint = “$($templateBucket.orderHint) $($templateBucket.orderHint)P!”
$templateBucket.planId = $newPlanId

$Request = @”
$($templateBucket | ConvertTo-Json)
“@

$headers = @{}
$headers.Add(‘Authorization’,’Bearer ‘ + $Token.AccessToken)
$headers.Add(‘Content-Type’, “application/json”)
$headers.Add(‘Content-length’, + $Request.Length)
$headers.Add(‘Prefer’, “return=representation”)

$newBucket = Invoke-WebRequest -Uri “https://graph.microsoft.com/beta/planner/buckets” -Method Post -Body $Request -Headers $headers
$newBucketContent = $newBucket.Content | ConvertFrom-Json
$newBucketId = $newBucketContent.id

$bucketHashTable.Add($templateBucketId, $newBucketId)
}

Similar to my newCategoryDescriptions I can re-use the object I have – and am just trimming out the old Id (keeping a copy first) and the etag – then correcting the newPlanId and the orderHint – and all is good!  I put the old and new bucket Ids into my hash table.

Tasks and assignments are all in the task object – so I can add them all in one call rather than using the POST then PATCH that I did in the earlier blog post.  I end up removing rather a lot of the members of this object – and it may have been easier to just build a clean one – but thought it worth doing it this way for illustration of the members involved.  Another key part here is taking a copy of the object rather than working on the object from my ForEach, as in PowerShell the objects are just references – so removing something in the loop removes it from the collection.
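
As a quick standalone illustration of that reference behavior (nothing Planner-specific here – just PowerShell):

# $item is a reference to the object in the collection, not a copy
$list = @([pscustomobject]@{ Name = "A"; Tag = "x" })
$item = $list[0]
$item.PSObject.Members.Remove("Tag")
$list[0].Tag                        # now $null - the removal hit the collection too
$copy = $list[0].PSObject.Copy()    # a copy is safe to trim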

$taskHashTable = @{}

ForEach($templateTask in $templateTasksValue){

$tempTask = $templateTask.PSObject.Copy()

$tempTask.PSObject.Members.Remove(“Id”)
$tempTask.PSObject.Members.Remove(“@odata.etag”)
$tempTask.PSObject.Members.Remove(“createdDateTime”)
$tempTask.PSObject.Members.Remove(“createdBy”)
$tempTask.PSObject.Members.Remove(“conversationThreadId”)
$tempTask.PSObject.Members.Remove(“percentComplete”)
$tempTask.PSObject.Members.Remove(“hasDescription”)
$tempTask.PSObject.Members.Remove(“referenceCount”)
$tempTask.PSObject.Members.Remove(“checklistItemCount”)
$tempTask.PSObject.Members.Remove(“activeChecklistItemCount”)
$tempTask.PSObject.Members.Remove(“assigneePriority”)
$tempTask.PSObject.Members.Remove(“previewType”)
$tempTask.PSObject.Members.Remove(“completedDateTime”)
$tempTask.PSObject.Members.Remove(“completedBy”)
If($tempTask.startDateTime){
$tempTask.startDateTime = ([DateTime]$tempTask.startDateTime).AddDays(7)
}
If($tempTask.dueDateTime){
$tempTask.dueDateTime = ([DateTime]$tempTask.dueDateTime).AddDays(7)
}
$tempTask.orderHint = “$($tempTask.orderHint) $($tempTask.orderHint)P!”
$tempTask.planId = $newPlanId
$tempTask.bucketId = $bucketHashTable.Get_Item($templateTask.bucketId)

$assignees = $tempTask.assignments.PSObject.Properties | Select Name, Value

ForEach($assignee in $assignees){
$assignee.Value.PSObject.Members.Remove(“assignedBy”)
$assignee.Value.PSObject.Members.Remove(“assignedDateTime”)
$assignee.Value.orderHint = “$($assignee.Value.orderHint) $($assignee.Value.orderHint)P!”
}

$Request = @”
$($tempTask | ConvertTo-Json)
“@

$headers = @{}
$headers.Add(‘Authorization’,’Bearer ‘ + $Token.AccessToken)
$headers.Add(‘Content-Type’, “application/json”)
$headers.Add(‘Content-length’, + $Request.Length)
$headers.Add(‘Prefer’, “return=representation”)

$newTask = Invoke-WebRequest -Uri “https://graph.microsoft.com/beta/planner/tasks” -Method Post -Body $Request -Headers $headers
$newTaskContent = $newTask.Content | ConvertFrom-Json
$newTaskId = $newTaskContent.id

$taskHashTable.Add($templateTask.Id, $newTaskId)

}

As I mentioned earlier – I’m just adding 7 days – but you could pass in a value.
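
For example, a small change like the following would let you pass the shift in when calling the script – a sketch, where the $DayOffset parameter is my own invention:

# At the top of the ps1, rather than hard-coding 7
param([int]$DayOffset = 7)

# Then in the task loop
If($tempTask.dueDateTime){
$tempTask.dueDateTime = ([DateTime]$tempTask.dueDateTime).AddDays($DayOffset)
}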

Finally I set the task details – and again this is by manipulating a copy of the object from the template and handling the previewType and checklists all in one loop.  You could also add the references in here easily enough – but one consideration is where the references refer to.  If they are public web references or static internal links then all is fine – but if they refer to items stored in the Group’s SharePoint site then you’ll need to consider how you want to handle this.  Possibly you’d need to copy the contents across and update the references accordingly.

ForEach($templateTaskDetailContent in $templateTaskDetailsContents){

$newTaskDetailContent = $templateTaskDetailContent.PSObject.Copy()

$headers = @{}
$headers.Add(‘Authorization’,’Bearer ‘ + $Token.AccessToken)
# Getting the current etag
$uri = “https://graph.microsoft.com/beta/planner/tasks/” + $taskHashTable.Get_Item($templateTaskDetailContent.Id) + “/details”

$result = Invoke-WebRequest -Uri $uri -Method GET -Headers $headers
$freshEtagTaskContent = $result.Content | ConvertFrom-Json
$newTaskDetailContent.PSObject.Members.Remove(“@odata.context”)
$newTaskDetailContent.PSObject.Members.Remove(“@odata.etag”)
$newTaskDetailContent.PSObject.Members.Remove(“id”)
$newTaskDetailContent.PSObject.Members.Remove(“references”)

$checklist = $newTaskDetailContent.checklist.PSObject.Properties | Select Name, Value

ForEach($checkItem in $checklist){
$checkItem.Value.PSObject.Members.Remove(“lastModifiedBy”)
$checkItem.Value.PSObject.Members.Remove(“lastModifiedDateTime”)
$checkItem.Value.isChecked = “false”
$checkItem.Value.orderHint = “$($checkItem.Value.orderHint) $($checkItem.Value.orderHint)P!”
}

$Request = @”
$($newTaskDetailContent | ConvertTo-Json)
“@

$headers = @{}
$headers.Add(‘Authorization’,’Bearer ‘ + $Token.AccessToken)
$headers.Add(‘If-Match’, $freshEtagTaskContent.’@odata.etag’)
$headers.Add(‘Content-Type’, “application/json”)
$headers.Add(‘Content-length’, + $Request.Length)

$uri = “https://graph.microsoft.com/beta/planner/tasks/” + $taskHashTable.Get_Item($templateTaskDetailContent.Id) + “/details”

$result = Invoke-WebRequest -Uri $uri -Method PATCH -Body $Request -Headers $headers
}

My new clone plan looks like the following – and as mentioned I’m still trying to get the right logic for the ordering of tasks and checklists – the buckets look just fine. It might come down to sorting the items within the buckets/lists in PowerShell and then applying a new orderHint as per the documentation at https://developer.microsoft.com/en-us/graph/docs/api-reference/beta/resources/planner_order_hint_format.  I’m keen to get some good examples of that – as I see on StackOverflow that a few of you are struggling with orderHint.

image

Microsoft Planner: Why can’t I add favorites or comment?

$
0
0

This has been a hot topic for support cases since Microsoft Planner first launched – so why are there cases where you are not able to comment or add a favorite plan?  There can be several reasons, and it is sometimes further confused when we tell customers that this isn’t supported for Hybrid installations where Exchange is still used on premises – and the reply we get is “Well it works for some of our users but not others…”  Hopefully the following will cover the vast majority of cases you might come across – but I’m sure there may be other nuances in this complex world that might give different behaviors…  Read on.

Hybrid Scenarios

In hybrid scenarios we do not support comments or favorites for users with their mailboxes on-premises, as the user needs to have an Exchange Online mailbox to engage with these features.  EXCEPT – if the user does have an external email address configured and is configured as a MailUser in Exchange AD.  In this scenario the user will be able to use these features even though they have no EXO mailbox.  If the user is configured as a User rather than a MailUser – and has no external email address – then things won’t work.  This explains the situations where some users work and some don’t in a hybrid environment.  Also of course some users may be fully in EXO and others in on-premises mailboxes.
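
If you want to check how a particular user is configured, a quick way – a sketch, run from an Exchange PowerShell session, with a placeholder address – is to look at the recipient type and external address:

# MailUser with an external address should work; User with no external address won't
Get-Recipient -Identity "someone@contoso.com" | Select-Object Name, RecipientType, ExternalEmailAddress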

Yammer connected Groups

One recent feature change was the ability to have Office 365 Groups for Yammer groups – and these Groups can have plans.  However, as they are using Yammer rather than Exchange for their conversations this does not work with comments and favorites – both of which rely on the EXO mailbox.  We should be removing some UI in this scenario, which is currently confusing things by making it look like these features should work.

Soft deleted items

Another potential blocker to adding comments is when a soft deleted item for Exchange (something still in the recycle bin) is found by the Graph call we are making and used in preference to the real current item.  This one appears to be a bug, and is usually related to the ‘From:’ field we are passing in to the call.  If you are hitting this problem then either purging the item or renaming the soft deleted item should enable comments to work again.

Microsoft Teams

I think we did see some issues with Plans in Teams and conversations – but some of these can be explained by the details above – and some we have already addressed.  We continue to work on improving the Planner experience in Teams and you should see good things here coming over the remainder of this year.

Microsoft Planner: Considerations for Reporting–Part 1

$
0
0

I was very fortunate recently to get the chance to present at Microsoft Ready, an internal worldwide training event held for the first time in Las Vegas – after running for many years in Seattle.  My session was on Microsoft Planner and I can’t repeat everything here as it was an internal event – but wanted to share part of the session that covered reporting in Planner.  We don’t have any built in reporting yet, but using MSGraph and/or Flow and some Azure bits and pieces you can certainly report against your Planner information.

In Part 1 I will cover the basics of reporting in Planner, covering what makes sense when reporting, and what are the key data elements of Planner.  In Part 2 I will look at Flow and how you could use that as a way to propagate your Planner information to a better reporting store, then I’ll finish with Part 3, where I pull all my Planner data out using Python and load up into Azure Cosmos DB before reporting against that data with Power BI.

But before getting into the technical stuff – what type of reports make sense for Planner?  What are you looking for and what makes sense to report on?  Unlike Project Online, in Planner there is no concept of ‘work’ or ‘effort’ – and although the tasks do have a percentComplete field this is used in Planner as being 0, 50 or 100 – meaning not started, in progress or finished.  So when reporting in Planner, are you really comparing apples with apples – or are they pears?  And if they are apples, are they the same apples?  We need consistency.

image

If you’ve seen my previous blog posts on Planner you may have come across my PowerShell examples that allow you to clone plans by making a copy of the categories, buckets, tasks, checklists and assignments into a new plan.  One win here is consistency of categories (the colored fly-outs) and buckets.  This ensures that you can be consistent when reporting across all your plans – maybe having a regular count of how many tasks you have in each bucket – or how many are tagged with each category.  Both buckets and categories are unique to a Plan – but being consistent across plans enables you to then report consistently.  For example if you had a standard set of categories for most of your plans (or different sets of categories for sets of plans of the same type) then you could group these plans based on the category.  More later, with an example in Part 3.

I’ve mentioned tasks, buckets, categories etc. but let’s take a look at the various Graph calls that you’d find useful when working with Planner data along with the datasets returned.  One challenge here is that some of the calls will only get your plans – so depending on your requirements you might need to ensure that your reporting ‘person’ or service account is a member of all the groups of interest.  We will likely be adding more APIs in Graph to facilitate reporting at some point.

First we can use the https://graph.microsoft.com/v1.0/me/Planner/Plans call to get all of the plans that we are subscribed to – and the concept of being subscribed needs a little explanation.  If you create a plan then you are subscribed – and others can become subscribed too.  However, if you create a plan programmatically, for example using the PowerShell script, you do not become subscribed until you open the plan (I found this out the hard way).  Also there is a limit to the number of subscriptions – so you can find yourself unsubscribed in some cases.

You can follow along in the Graph Explorer - https://developer.microsoft.com/en-us/graph/graph-explorer and the response to the me/Planner/Plans call will look something like this, where this is just the first plan of many:

image

It matches up to the UI of the plan:

image

The title in this case is Template, and the other information is that this plan was created by user with id "cf091cb1-dc23-4e12-8f30-b26085eab810" – which happens to be me.  This was created by the Planner web app – but we could see a different application id if for example we had used a PowerShell script and used a client id.  The owner field is actually the id of the Group that this plan belongs to.  For plans created through the Planner web app this will usually be a 1 to 1 relationship, but with the advent of Microsoft Teams you can now see multiple plans ‘owned’ by the same Group.  The eTag value can be thought of as a version identifier.  If the same eTag is found then nothing has changed.  This is used more for making updates, where the current eTag must be passed in to any update call to ensure you are updating from the current version – and someone else hasn’t updated the record since you made the last read.
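
To make the eTag usage concrete, here is a minimal sketch of an update call – assuming $token holds a valid access token and $plan is a plan previously read from Graph (the new title is just an example value):

$headers = @{}
$headers.Add('Authorization','Bearer ' + $token)
$headers.Add('If-Match', $plan.'@odata.etag')   # must match the current version or the PATCH fails
$headers.Add('Content-Type', "application/json")

$body = @{ title = "New title" } | ConvertTo-Json
Invoke-WebRequest -Uri ("https://graph.microsoft.com/v1.0/planner/plans/" + $plan.id) -Method PATCH -Body $body -Headers $headers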

Another way to get all of our plans would be to get them from the Groups.  With the https://graph.microsoft.com/v1.0/groups?$filter=groupTypes/any(c:c+eq+'Unified') call we can get all of our Office 365 Groups.  If we omitted the filter for groupTypes we would also see any Exchange Groups (DLs) we might have.  Once we have the Group – like this one:

image

we can use the id "6ff15978-94d4-414f-a497-295a245718bc" to get all the plans in the Group using the call to https://graph.microsoft.com/v1.0/groups/6ff15978-94d4-414f-a497-295a245718bc/Planner/Plans

image

Here we pull back the "Fall New Employee Orientation" plan, which was created using my PowerShell – which had the client id "3a32234d-52b2-4224-a4cd-aaa9819e66af".

In this example using the id of a Group that was created when adding a new Team, which also has two Planner tabs added – I can see multiple Plans in the group - https://graph.microsoft.com/v1.0/groups/400ea8d7-334b-4d26-b963-8fe400fc675a/Planner/Plans

image

I can also see these were created via Teams as the application id is different - "5e3ce6c0-2b1f-4285-8d4b-75ee78787346".

Using either of the approaches above to get the Plans, and their ids I can then use other Graph calls to get the rest of the items of interest.

https://graph.microsoft.com/v1.0/Planner/Plans/<Plan_ID>/Details will get me the Plan Details – notably the category descriptions:

image

https://graph.microsoft.com/v1.0/Planner/Plans/<Plan_id>/Buckets will get me the buckets, and even though I may have the same named buckets across my plans, particularly if I am using a template, each will have a unique id for the same named bucket in different plans.  We need to use this to correctly match up the tasks to buckets – but we might want to do some transforming of the data to use the name instead for our reporting:

image

https://graph.microsoft.com/v1.0/Planner/Plans/<Plan_id>/Tasks gets us down to the meat in Planner – and the tasks.  Points of interest here are the planId and bucketId so we know where these tasks fit – then the title, percentComplete and the assignments collection are the most commonly required fields.  The first GUID in each element of the assignments collection is the identifier for the assigned resource – in this case Katie Jordan – which we will be getting from the Group Members call later.

image

https://graph.microsoft.com/v1.0/Planner/Tasks/<Task_id>/Details will return the task details – which contains any references, checklists (and their state) and any description – as well as the id to tie it back to the right task.

image

The final call we will make gets us the members.  This is needed for each groupId – which we can get either from the Groups if we used that route, or from the Plan owner if we started with our Plans.  https://graph.microsoft.com/v1.0/Groups/<Group_id (Plan owner)>/Members  For reporting purposes we may just need the id and displayName – and we would need to de-dupe after getting all our members back.
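
To tie these calls together, here is a minimal sketch that walks from Groups to Plans to Tasks – assuming $headers already carries a valid bearer token as in my earlier posts, and wrapping the plans call in a try/catch since not every Group has a plan:

$uri = "https://graph.microsoft.com/v1.0/groups?`$filter=groupTypes/any(c:c+eq+'Unified')"
$groups = ((Invoke-WebRequest -Uri $uri -Method GET -Headers $headers).Content | ConvertFrom-Json).value

ForEach($group in $groups){
try{
$uri = "https://graph.microsoft.com/v1.0/groups/" + $group.id + "/planner/plans"
$plans = ((Invoke-WebRequest -Uri $uri -Method GET -Headers $headers).Content | ConvertFrom-Json).value
ForEach($plan in $plans){
$uri = "https://graph.microsoft.com/v1.0/planner/plans/" + $plan.id + "/tasks"
$tasks = ((Invoke-WebRequest -Uri $uri -Method GET -Headers $headers).Content | ConvertFrom-Json).value
$tasks | Select title, percentComplete, bucketId
}
}
catch{}  # Groups without plans will throw - skip them
}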

That is it for Part 1 – a quick run around the Graph Explorer to help you understand which data elements you would need as well as highlighting some of the challenges and considerations when you start a reporting journey with Planner.  And it goes without saying, but I’ll say it anyway, that for any reporting journey you need to know the destination before setting out – so be sure you have a good understanding of the outputs so you’ll know if all the inputs are present that you need – and in a suitable format.

Microsoft Planner: Considerations for Reporting–Part 2

$
0
0

A little longer getting to this than I’d hoped – but some of that time (as well as the day job) was spent doing a better job of what will be part 3 – so looking forward to writing that up!  For Part 2 though I’m looking at Flow to start with – and then seeing what I can do with the data.  Don’t expect to see a finished example here that will fit your needs – I’m really just putting ideas out there – see what might work for you.

Flow is really gathering momentum and I was surprised at the variety of triggers and actions available for both Microsoft and 3rd Party applications. Unrelated to reporting – but one example I showed at Ready was taking a picture with Adobe’s Creative Cloud, which when saved as an asset triggered the Flow and then this in turn created a Planner task in my ‘Photos’ plan.  Cool.  You can imagine taking it a bit further and getting the image attached and maybe even reading the EXIF GPS and adding that.  One extended scenario could be fixing potholes in roads – take a picture and the task to fix it gets created and assigned.

But back to reporting.  The flow I decided to use was triggered by a daily schedule (unfortunately no triggers in Planner yet), then this flows to the List my plans action – and this creates a Blob in my Azure Blob storage account, in this case just writing the value from the plan.

image

Looking at my Azure storage through the Microsoft Azure Storage Explorer I can see I have three ‘directories’ – and the one I was writing to was planner_plans.

image

Opening up that directory I see my 31 plans, where the name is the ID of the Plan and the contents is just the ID and title.

image

I then have another flow triggered by new blobs in this directory.

image

A new Blob makes a call to List tasks – using the Plan ID, then this writes to the planner_plan_tasks directory a blob with the name being the ID of the task, then a comma separated list with the task, plan and bucket IDs, then the start and due dates and the percent complete – and finally the title – and in this case a hard-coded timestamp.  The idea of the time stamp was to have snapshots – but I’m not really taking it that far yet.  So the contents would look something like:

-gNBHh_yukGpXdlmqy3IO2QAGiF8,lX-TnekUv0e8nVLMP4q6X2QABVGT,naTONxTjREe2F7SzoP9Fr2QAMS_w,2017-06-19T12:00:00Z,,100,Satisfaction,2017-07-01

The other directory isn’t in these flows but I did push the ‘members’ json into there just to play around with it.

image

And this was for the plan with the ID starting GmA – here we see the first few members of the group associated with that plan.

image

Now I have the data in blob storage there are a number of things I could do – either manually – or using Azure Data Factory to orchestrate moving this somewhere else.  I’ll admit this may well be adding more steps and process than is actually needed – and Part 3 shows a smoother approach – but one aim for me was using some of the cool new Azure capabilities just to get my head around how they work and how they could be used. 

I initially looked at the wizard approach using the preview ‘Copy Data’ option in Data Factory – and you can see I already have 6 linked services.

image

The basic stages are to enter the properties (I just named my sample and set it to one time), then you get to choose your Source and Destination – either from the full list of data stores (which includes 3rd party sources too) or from existing connections – and I’d already created sources for my blob storage and destinations for my SQL Azure database.  In this example I’ll choose the json file of members shown above (and I’m not screenshotting all pages – there are some full tutorials out there).

image

In formatting the file settings I say it is a set of objects (my members) and then I want to cross-apply nested JSON array of value.  This then presents the columns and I have removed all but displayname and id.

image

I selected my destination as SQL Azure from an existing connection, then chose my ‘Members’ table.  I can then map my schema for source and destination.

image

I take the default for other settings around performance – then I can review my settings.

image

I go to the deploy stage – and all looks good!

image

All completed ok, and I can see the results in my database.

imageimage

Obviously you wouldn’t do this file by file as I have – and the pipelines can be defined in json and applied across many files.  So for my tasks for example I would define the pipeline and how to handle the CSV file – and it could then be set to process any files that arrived in a certain blob.  Reporting against my data once it is in SQL Server would then be very straightforward using any of a number of tools.

As an example of what the pipeline might look like in its json format – here is my activity that did that copy.  There will be similar json files representing the source and destination, as well as the linked services:

{
     "name": "Activity-0-GmA-1oD6tUC27IgEEuENRWQAGrMQ-2017-07-01_json->[dbo]_[Members]",
     "linkedServiceName": null,
     "inputs": [
         {
             "name": "InputDataset-g18"
         }
     ],
     "outputs": [
         {
             "name": "OutputDataset-g18"
         }
     ],
     "type": "Copy",
     "policy": {
         "concurrency": 1,
         "timeout": "1.00:00:00",
         "retry": 3,
         "delay": "00:00:00",
         "executionPriorityOrder": "NewestFirst",
         "longRetry": 0,
         "longRetryInterval": "00:00:00",
         "style": "StartOfInterval"
     },
     "scheduler": {
         "frequency": "Day",
         "interval": 1
     },
     "typeProperties": {
         "source": {
             "type": "BlobSource",
             "recursive": false
         },
         "sink": {
             "type": "SqlSink",
             "writeBatchSize": 0,
             "writeBatchTimeout": "00:00:00"
         },
         "translator": {
             "type": "TabularTranslator",
             "columnMappings": "id:memberId,displayname:displayName"
         }
     }

}


Take a look at Flow, and Azure Data Factory – there is some cool technology that can help you move and consolidate your various data elements, wherever they may currently sit.

There are also some very cool json capabilities in SQL Server 2016 if you hadn’t already seen them – so plenty of options for loading data into SQL even if Azure isn’t on your radar (but hopefully I’ve piqued your interest).  See the Getting Started information if you want to have a play – with $200 free credit - https://azure.microsoft.com/en-us/get-started/ – and plenty of free options.

There are a few limitations on the data you can get out via Flow right now but there are further triggers and activities planned. (OK, pun intended Smile).  For Part 3 I did a swap back to PowerShell from the initial idea of Python (only driven by my inexperience with Python rather than real language capabilities) – and used Azure Functions for a serverless approach to running the scripts – with output to Azure Cosmos DB using REST (I’d used the Data Transfer utility in my Ready session).  For reporting I had Flow pushing some data into SharePoint lists – and connection to Cosmos DB from Power BI.  Stay tuned!

Microsoft Planner: Considerations for Reporting-Part 3

$
0
0

For the third and final part of this short series on reporting against Planner data I will be taking some of the ideas from Part 1 and Part 2 and putting more process around them.  This is still really just ideas to make you think about your use of Planner – and not a baked solution.  I don’t for example handle the messy stuff of keeping everything in sync – and take an approach of loading a snapshot each time.  Also the tools I’m using may not be the best ones for you – but hopefully highlight the open nature of Graph and also showcase the parts that Flow and many of the Azure workloads can play in an overall solution.

This is one of the outcomes of the blog – just to keep you reading…  A Power BI report of the status of Planner tasks – for a specific set of Plans.

image

For the session at our internal Ready conference that sparked this series I used Python for some of the Graph reading – based on this sample – https://github.com/microsoftgraph/python3-connect-rest-sample – which was a web app using Flask that started the authentication; I then just made the necessary changes to the requested permission scopes, made the various Graph calls and manipulated the response.  I finally used the Azure DocumentDB Data Migration tool to push the data into Azure Cosmos DB.  It felt a little clunky.

Since then I’ve re-worked things and moved back from Python to PowerShell (mainly due to my competencies – still learning Python – come to that still learning PowerShell!) and combined Flow and Azure Functions together to move the data via REST calls into Azure Cosmos DB.

So to start – I have an Azure Cosmos DB provisioned and a database created called Planner.  I decided to store data in various collections – matching up to the data elements in Planner – with the addition of an extra one for Assignments – pulling this outside the Task for easier reporting.  I could have saved all the different json into just one collection as CosmosDB is a NoSQL store – but again for reporting and hitting it with PowerBI the collections made it easier.  In my testing I was re-loading the data many times – so I made things easier by using the Cloud Shell and a script to delete and create the collections (first time I’d used vi for a while…).

image

The next part was the Azure Functions – which read all the Planner data using the Graph API and then write the data to CosmosDB.  Initially I did this in one function, but even with just 30 or so plans I was getting close to the function time-out of 5 minutes so changed it so that the initial function reads most of the plan level data and then makes another function call for each of the plans to get the task data.  This could probably be improved to pass over the token information too – avoiding some re-authentication in the 2nd function.

You pay for Azure functions based on the time they are running and the resources they consume – and storage while they are not running.  I’ve yet to hit $0.05 on my usage during my testing for the function costs.

The first function has some setup stuff for the CosmosDB REST calls, and the authentication key is created based on the verb, resource type, resource id, date and the master key – so these are needed for each collection.  There is also some setup for the Planner Graph calls (including keys/accounts/passwords as this runs from the function) and then it imports some modules for the Graph auth.  These modules need to be loaded into the function – which is carried out using the right hand ‘View files’ window.

image

I’ll add the full script at the foot of the blog – so as not to break the flow of this document too much – and just walk through some key parts here.  In an earlier version I had used the me/Planner/Plans Graph call to get the Plans, but there are some challenges with that call in that it only gets the subscribed plans – so I switched in this example and used the v1.0/Groups call (as the account I had set had access) and then used the groups/<GroupId>/planner/plans call.  This has the additional advantage that it does get the potentially multiple plans per Group that can be created via Microsoft Teams – hence my use of a ForEach when reading the plans from the Group.

One other gotcha I hit when running from the function compared to just running in PowerShell on my desktop was seeing some ‘bad json’ messages.  Adding a –Compress to the ConvertTo-Json appeared to resolve these, and I suspect either trailing spaces or ‘\’ may have been the root cause – exposed by a different PowerShell version.  I also wrapped the planner loop in a try/catch – as some groups may not have plans.
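
In practice that just means building the request body with something like the following – a single-line json payload rather than the pretty-printed default (the variable name here is just a placeholder):

# -Compress emits single-line json, which avoided the 'bad json' errors in the function
$Request = $planData | ConvertTo-Json -Compress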

After looping around the plans, plan details and members, and writing each of the sets of data off to CosmosDB using a REST call like the one in the following screenshot, I made the call to the second function – shown in the subsequent screenshot.  You can also see in the PlanDetails section that this is where I am filtering the Plans and only pushing all the remaining data to CosmosDB if the 1st category is set to Marketing.  This is a pretty rough way to filter – but it does the job in my case as all the plans I am interested in were created from a template where category 1 was Marketing.

image

And here is the call to the subsequent function.  I am passing in the Plan Id, and using Start-Job so this is an asynchronous call so I’m not waiting on a response.  This also necessitates the use of $using so that the async call has access to the local variables.  You probably want to add some kind of error handling here – or just keep an eye on function failures with an alert.  Notice the –UseBasicParsing too – as this is required since the call is being made in the function and the Internet Explorer engine is not available:

image
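
The shape of that call is roughly the following – a sketch, where the function URL, key and parameter name are placeholders for your own function’s values:

# Fire and forget - $using: makes the local variables visible inside the job
$functionUri = "https://myfunctions.azurewebsites.net/api/GetPlanTasks?code=<function key>&planId=" + $plan.id
Start-Job -ScriptBlock {
Invoke-WebRequest -Uri $using:functionUri -Method GET -UseBasicParsing
}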

The second function called above is very similar once we get past the initial code that gets the Plan ID from the request.

image

I proceed with the REST and Planner auth stuff after that (which perhaps could be passed in the request and save some time/money), and then one added complexity was pulling out the assignments into their own collection – so it is easier to look at assignments across plans.

image

In my case I’ve just executed the first function manually for now, but it would make sense to have this on a schedule – and you could also tie in the deletion/creation of collections to save on the costs between reporting cycles.  If I hadn’t wanted to learn more about CosmosDB it would probably make more sense to push this data into SQL Azure – but the very cool multi geo replication possibilities with CosmosDB are certainly worth a look.  I did start spending a bit more of my Azure allocation than I’d expected after creating collections with high throughput capabilities – and writing my data to Europe which then replicated to the US and Brazil.  Lesson learned.

Monitoring the functions through the Azure Portal I can see that my current run of the initial function was successful and took just over 2 minutes.  I left some of my earlier failures in there too – the debug capabilities in Functions are pretty good!

image

This of course was then calling out to the other function – so monitoring that one I also see success (just showing the last few here) and they were taking around 15 to 20 seconds a time.

image

Here is an example of one record – part of a task – and as you can see it is stored in json format:

image
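
The field values below are just placeholders rather than the actual record in the screenshot, but a task document looks broadly like this:

{
  "planId": "<plan id>",
  "bucketId": "<bucket id>",
  "title": "<task title>",
  "percentComplete": 0,
  "id": "<task id>"
}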

Having got this far, I’m thinking a Part 4 of my trilogy will be needed to dig deeper into using Power BI directly against CosmosDB – for now I wanted to concentrate on another Flow.  I have a scheduled Flow that runs every day, makes several calls to query documents in my CosmosDB, and then creates and subsequently updates an item in a SharePoint list.

image

The first Query documents step looks like this – it goes to my Planner CosmosDB and my Tasks collection and executes the query

SELECT Count(c.id) FROM c WHERE c.percentComplete = 0

image

You can test out this kind of query against the CosmosDB in the Azure portal using the Query Explorer – my answer currently is 73 tasks with a percentComplete of 0, which means not started:

image

The next step of the Flow creates an item in SharePoint.  It looks like the following: I set the site address and the list name, write a title (which I don’t actually use), set the date to now, and use an expression to write out my query result of 73.

image

The expression to get the value from my query was the tricky part for me – but finally this worked:

int(body('Query_documents')?['Documents']?[0]?['$1'])

The remaining steps just repeat the query for 50% (In Progress) and 100% (Completed), then update the list item (using its ID) with the appropriate values.
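
For reference, those follow-up queries are just variations on the first:

SELECT Count(c.id) FROM c WHERE c.percentComplete = 50

SELECT Count(c.id) FROM c WHERE c.percentComplete = 100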

Looking at my Flow run history I can see that I had a few failures – these were days when my collections didn’t exist.  The first two recent successes were just after I added my collections while writing this blog (result = 0 for all fields! – so I deleted that row in SharePoint), and the most recent successful run added my correct values.

In SharePoint my list looks like this:

image

I must admit to cheating here and adding a bit of history manually so that the Power BI reports would look good.  After pointing Power BI at my SharePoint list, removing the columns I wasn’t interested in, and ensuring the remaining data was being read as a Date field and three integers, I could finally get to the report I showed at the top of the blog – now with the additional values from today (I’d added a plan and updated a few progress values earlier):

image

I’ll do an addendum or Part 4 of the trilogy to look at using the Power BI CosmosDB direct reporting options – but hopefully this has given you some ideas.  I’ve probably skipped over some important point or other – let me know if anything needs a better explanation.

The PowerShell scripts used in the two functions follow, with some details edited where you will need to add your own accounts, keys, URLs etc.  These are supplied as-is – no warranties.  Beware of straight quotes being turned curly when copying, and of line breaks appearing where there shouldn’t be any.

The Initial Function

#Setup stuff for the DocDB REST calls

[System.Reflection.Assembly]::LoadWithPartialName("System.Web") | out-null

$global:resourceType="docs"

$verb="POST"

$date = Get-Date

$utcDate = $date.ToUniversalTime()

$global:xDate = $utcDate.ToString('r',[System.Globalization.CultureInfo]::InvariantCulture)

$global:masterKey = "<Your master key from CosmosDB goes here>"

# generate signature for REST call

function fnGetKey ($pVerb, $pResourceId, $pResourceType, $date, $masterKey)

{
   $keyBytes = [System.Convert]::FromBase64String($masterKey)
   $sigCleartext = @($pVerb.ToLower() + "`n" + $pResourceType.ToLower() + "`n" + $pResourceId + "`n" + $date.ToString().ToLower() + "`n" + "" + "`n")
   $bytesSigClear =[Text.Encoding]::UTF8.GetBytes($sigCleartext)
   $hmacsha = new-object -TypeName System.Security.Cryptography.HMACSHA256 -ArgumentList (,$keyBytes)
   $hash = $hmacsha.ComputeHash($bytesSigClear) 
   $signature = [System.Convert]::ToBase64String($hash)
   # UrlEncode needs System.Web - loaded at the top of the script
   $key  = [System.Web.HttpUtility]::UrlEncode($('type=master&ver=1.0&sig=' + $signature))
   return $key

}

#Setup stuff for the Planner Calls

$clientId = '<A registered client ID in Azure with the right access levels>'

$aadtenant = "<yourtenant>.onmicrosoft.com"

$username = "<yourname>@<yourtenant>.onmicrosoft.com"

$password = "<your password>" | ConvertTo-SecureString -AsPlainText -Force

$Credential = New-Object -typename System.Management.Automation.PSCredential -argumentlist $username, $password
   

$Modulebase = (get-Module MicrosoftGraphAPI).ModuleBase

Import-Module "D:\home\site\wwwroot\HttpTriggerPlannerToDocDB\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
  

Import-Module "D:\home\site\wwwroot\HttpTriggerPlannerToDocDB\Microsoft.IdentityModel.Clients.ActiveDirectory.WindowsForms.dll"
   

$resourceAppIdURI = "https://graph.microsoft.com"
   

$authority = "https://login.windows.net/$aadTenant"
   

$authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authority

$uc = new-object Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential -ArgumentList $Credential.Username,$Credential.Password

$token = $authContext.AcquireToken($resourceAppIdURI, $clientId,$uc)


#Get DocDB REST authkeys

$global:resourceId= "dbs/Planner/colls/Plans"

$planAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey

$global:resourceId= "dbs/Planner/colls/PlanDetails"

$planDetailsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey

$global:resourceId= "dbs/Planner/colls/Members"

$planMembersAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey

$global:resourceId= "dbs/Planner/colls/Buckets"

$planBucketsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey

$global:resourceId= "dbs/Planner/colls/Tasks"

$planTasksAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey

$global:resourceId= "dbs/Planner/colls/TaskDetails"

$planTaskDetailsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey

$global:resourceId= "dbs/Planner/colls/Assignments"

$planAssignmentsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey

#################################################

#Get Plans (V2 - from Groups)

#################################################

$headers = @{}

$headers.Add('Authorization','Bearer ' + $Token.AccessToken)

$headers.Add('Content-Type', "application/json")



#$plans = Invoke-WebRequest -Uri "https://graph.microsoft.com/v1.0/me/planner/plans" -Method Get -Headers $headers -UseBasicParsing

#$plansContent = $plans.Content | ConvertFrom-Json

#$planValue = $plansContent.Value

#First get Groups

$groups = Invoke-WebRequest -Uri "https://graph.microsoft.com/v1.0/Groups" -Method Get -Headers $headers -UseBasicParsing

$groupsContent = $groups.Content | ConvertFrom-Json

$groupValue = $groupsContent.Value

ForEach($group in $groupValue){
     try {

$uri = "https://graph.microsoft.com/v1.0/groups/" + $group.id + "/planner/plans"

$plans = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing

$plansContent = $plans.Content | ConvertFrom-Json

$planValue = $plansContent.Value

ForEach($plan in $planValue){
        
     $docdbHeader=@{"Authorization" = "$planAuthKey"; `
     "x-ms-version" = "2015-12-16"; `
     "x-ms-date" = "$xdate"; `
     "x-ms-indexing-directive" = "Include"; `
     "x-ms-documentdb-is-upsert" = "True"; `
     "Content-Length" = "0"
     }

$jsonBody = $plan | ConvertTo-Json -Compress

$docdbUri = "https://<YourCosmosDBName>.documents.azure.com/dbs/Planner/colls/Plans/docs"

Invoke-RestMethod -Uri $docdbUri -Headers $docdbHeader -Method $verb -ContentType "application/json" -Body $jsonBody



#################################################

#Get Plan Details

#################################################



$groupId = $plan.owner

$uri = "https://graph.microsoft.com/v1.0/planner/plans/" + $plan.id + "/details"

$planDetails = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing  

$planDetailsContent = $planDetails.Content | ConvertFrom-Json

# Only continue with this plan if the category1 matches our target plan type.

if($planDetailsContent.categoryDescriptions.category1 -eq "Marketing"){

$docdbHeader=@{"Authorization" = "$planDetailsAuthKey"; `
     "x-ms-version" = "2015-12-16"; `
     "x-ms-date" = "$xdate"; `
     "x-ms-indexing-directive" = "Include"; `
     "x-ms-documentdb-is-upsert" = "True"; `
     "Content-Length" = "0"
     }

$jsonBody = $planDetailsContent | ConvertTo-Json -Compress

$docdbUri = "https://<yourCosmosDBName>.documents.azure.com/dbs/Planner/colls/PlanDetails/docs"

Invoke-RestMethod -Uri $docdbUri -Headers $docdbHeader -Method $verb -ContentType "application/json" -Body $jsonBody

#GroupMember

$uri = "https://graph.microsoft.com/v1.0/groups/" + $groupId + "/members"

$members = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing

$membersContent = $members.Content | ConvertFrom-Json

$membersContent | Add-Member id $plan.id

$membersValue = $membersContent.value

$docdbHeader=@{"Authorization" = "$planMembersAuthKey"; `
     "x-ms-version" = "2015-12-16"; `
     "x-ms-date" = "$xdate"; `
     "x-ms-indexing-directive" = "Include";
     "x-ms-documentdb-is-upsert" = "True"; `
     "Content-Length" = "0"
     }

ForEach($member in $membersValue){

$jsonBody = $member | ConvertTo-Json -Compress

$docdbUri = "https://<yourCosmosDBName>.documents.azure.com/dbs/Planner/colls/Members/docs"

Invoke-RestMethod -Uri $docdbUri -Headers $docdbHeader -Method $verb -ContentType "application/json" -Body $jsonBody

}

#Buckets

$uri = "https://graph.microsoft.com/v1.0/planner/plans/" + $plan.id + "/buckets"

$buckets = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing

$bucketsContent = $buckets.Content | ConvertFrom-Json

$bucketsContent | Add-Member id $plan.id

$bucketsValue = $bucketsContent.value

$docdbHeader=@{"Authorization" = "$planBucketsAuthKey"; `
     "x-ms-version" = "2015-12-16"; `
     "x-ms-date" = "$xdate"; `
     "x-ms-indexing-directive" = "Include";
     "x-ms-documentdb-is-upsert" = "True"; `
     "Content-Length" = "0"
     }

ForEach($bucket in $bucketsValue){

$jsonBody = $bucket | ConvertTo-Json -Compress

$docdbUri = "https://<yourCosmosDBName>.documents.azure.com/dbs/Planner/colls/Buckets/docs"

Invoke-RestMethod -Uri $docdbUri -Headers $docdbHeader -Method $verb -ContentType "application/json" -Body $jsonBody

}

#Tasks - call out to separate function

$planId = $plan.id

$taskBody=@{"planId" = "$planId"}

$body=$taskBody | ConvertTo-Json -Compress

$taskHeaders = @{}

$taskHeaders.Add('Content-Type', "application/json")

$uri = "https://<yourfunctionendpoint>.azurewebsites.net/api/<YourfunctionName>?code=<from your function Url>”

Start-Job {Invoke-WebRequest -Uri $using:uri -Method Post -Headers $using:taskHeaders -Body $using:body -UseBasicParsing}

}

}

}

catch [Net.WebException] {}

}    

The 2nd Function – called from the last part of the one above:

# POST method: $req

$requestBody = Get-Content $req -Raw | ConvertFrom-Json

$planId = $requestBody.planId

# GET method: each querystring parameter is its own variable

if ($req_query_name)

{
     $planId = $req_query_name

}

#Setup stuff for the DocDB REST calls

[System.Reflection.Assembly]::LoadWithPartialName("System.Web") | out-null

$global:resourceType="docs"

$verb="POST"

$date = Get-Date

$utcDate = $date.ToUniversalTime()

$global:xDate = $utcDate.ToString('r',[System.Globalization.CultureInfo]::InvariantCulture)

$global:masterKey = "<YourCosmosDB master key>"

# generate signature for REST call

function fnGetKey ($pVerb, $pResourceId, $pResourceType, $date, $masterKey)

{
   $keyBytes = [System.Convert]::FromBase64String($masterKey)
   $sigCleartext = @($pVerb.ToLower() + "`n" + $pResourceType.ToLower() + "`n" + $pResourceId + "`n" + $date.ToString().ToLower() + "`n" + "" + "`n")
   $bytesSigClear =[Text.Encoding]::UTF8.GetBytes($sigCleartext)
   $hmacsha = new-object -TypeName System.Security.Cryptography.HMACSHA256 -ArgumentList (,$keyBytes)
   $hash = $hmacsha.ComputeHash($bytesSigClear) 
   $signature = [System.Convert]::ToBase64String($hash)
   # UrlEncode needs System.Web - loaded at the top of the script
   $key  = [System.Web.HttpUtility]::UrlEncode($('type=master&ver=1.0&sig=' + $signature))
   return $key

}

#Setup stuff for the Planner Calls

$clientId = '<A registered client ID in Azure with the right access levels>'

$aadtenant = "<yourtenant>.onmicrosoft.com"
$username = "<yourname>@<yourtenant>.onmicrosoft.com"
$password = "<your password>" | ConvertTo-SecureString -AsPlainText -Force

$Credential = New-Object -typename System.Management.Automation.PSCredential -argumentlist $username, $password
   

$Modulebase = (get-Module MicrosoftGraphAPI).ModuleBase

Import-Module "D:\home\site\wwwroot\HttpTriggerPlannerToDocDB\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
  

Import-Module "D:\home\site\wwwroot\HttpTriggerPlannerToDocDB\Microsoft.IdentityModel.Clients.ActiveDirectory.WindowsForms.dll"
   

$resourceAppIdURI = "https://graph.microsoft.com"
   

$authority = "https://login.windows.net/$aadTenant"
   

$authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authority

$uc = new-object Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential -ArgumentList $Credential.Username,$Credential.Password

$token = $authContext.AcquireToken($resourceAppIdURI, $clientId,$uc)


#Get DocDB REST authkeys

$global:resourceId= "dbs/Planner/colls/Tasks"

$planTasksAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey

$global:resourceId= "dbs/Planner/colls/TaskDetails"

$planTaskDetailsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey

$global:resourceId= "dbs/Planner/colls/Assignments"

$planAssignmentsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey

#Get Tasks stuff for planId

$headers = @{}

$headers.Add('Authorization','Bearer ' + $Token.AccessToken)

$headers.Add('Content-Type', "application/json")

$uri = "https://graph.microsoft.com/v1.0/planner/plans/" + $planId + "/tasks"

$tasks = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing

$tasksContent = $tasks.Content | ConvertFrom-Json

$tasksValue = $tasksContent.value

$docdbTaskHeader=@{"Authorization" = "$planTasksAuthKey"; `
     "x-ms-version" = "2015-12-16"; `
     "x-ms-date" = "$xdate"; `
     "x-ms-indexing-directive" = "Include";
     "x-ms-documentdb-is-upsert" = "True"; `
     "Content-Length" = "0"
     }

ForEach($task in $tasksValue){

$assignmentNames=@()  # empty array to collect the assignment name properties

$jsonBody = $task | ConvertTo-Json -Compress

$docdbUri = "https://<YourCosmosDBName>.documents.azure.com/dbs/Planner/colls/Tasks/docs"

Invoke-RestMethod -Uri $docdbUri -Headers $docdbTaskHeader -Method $verb -ContentType "application/json" -Body $jsonBody

ForEach($assignment in ($task.assignments | Get-Member -MemberType NoteProperty)){

[array]$assignmentNames +=$assignment

}

ForEach($assignmentName in $assignmentNames){

$assignee = New-Object PSObject  # a PSObject so the Add-Member note properties serialize with ConvertTo-Json

If($assignmentName.Name){

$assignee | Add-Member name $assignmentName.Name.ToString()

}

Else{

$assignee | Add-Member name "Unassigned"

}

$assignee | Add-Member planId $planId 

$assignee | Add-Member taskId $task.id

$uniqueAssnId = $assignee.planId + "_" + $assignee.taskId + "_" + $assignee.name

$assignee | Add-Member id $uniqueAssnId

$docdbAssnHeader=@{"Authorization" = "$planAssignmentsAuthKey"; `
     "x-ms-version" = "2015-12-16"; `
     "x-ms-date" = "$xdate"; `
     "x-ms-indexing-directive" = "Include";
     "x-ms-documentdb-is-upsert" = "True"; `
     "Content-Length" = "0"
     }

$jsonBody = $assignee | ConvertTo-Json -Compress

$docdbUri = "https://<YourCosmosDBName>.documents.azure.com/dbs/Planner/colls/Assignments/docs"

Invoke-RestMethod -Uri $docdbUri -Headers $docdbAssnHeader -Method $verb -ContentType "application/json" -Body $jsonBody

}



}

#Task Details

ForEach($task in $tasksValue){
     $uri = "https://graph.microsoft.com/v1.0/planner/tasks/" + $task.id + "/details"

    $taskDetails = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing
     $taskDetailsContent = $taskDetails.Content | ConvertFrom-Json
    
     $docdbTaskDetailsHeader=@{"Authorization" = "$planTaskDetailsAuthKey"; `
         "x-ms-version" = "2015-12-16"; `
         "x-ms-date" = "$xdate"; `
         "x-ms-indexing-directive" = "Include"; `
         "x-ms-documentdb-is-upsert" = "True"; `
         "Content-Length" = "0"
         }
        
     $jsonBody = $taskDetailsContent | ConvertTo-Json -Compress

    $docdbUri = "https://<YourCosmosDBName>.documents.azure.com/dbs/Planner/colls/TaskDetails/docs"

    Invoke-RestMethod -Uri $docdbUri -Headers $docdbTaskDetailsHeader -Method $verb -ContentType "application/json" -Body $jsonBody
     }
