Wednesday, May 31, 2017

Consuming data from On-premises into Office 365

Office 365 is a great platform for organizations of any size. It gives users access to all kinds of applications and services that can benefit your organization and business processes. One of the questions I get asked a lot is about consuming on-premises data from a line-of-business application up in Office 365.

read more


by via SharePoint Pro

Tuesday, May 30, 2017

IT/DEV Connections: Making you a better SharePoint Developer

Many years ago, in a previous role, I opted to join the consultancy team, specifically to work with a new product due to come out at some point in 2001. At the time, it was available to those that worked directly with Microsoft on projects. I was working at the Microsoft campus in Reading, UK, and through the project I was on I got access to nightly builds of what would become SharePoint Portal Server 2001. As that project expanded, I got to do the same for SharePoint 2003 as well. I would install and test it, repeating that cycle almost daily.

read more


by via SharePoint Pro

Tuesday, May 23, 2017

SharePoint and Office 365: The New Beautiful Cookbook Series

Most of us are “meat and potatoes” people when it comes to the technology we use. We like what we know and we know what we like. (Yes, there are vegan “seitan and potatoes” people, vegetarian “sprouts and potatoes” people, pescatarian “cod and potatoes” people, etc. I’m not trying to leave anyone out.)

Every once in a while, though, someone hands us a new ingredient – something we’ve never seen before, something we’ve never cooked with.

Image from the Netflix show Chef’s Table S3E6 – Virgilio Martinez

That new ingredient becomes a part of our pantry, and we want to try to cook with it. We’ve probably heard how delicious it is or how it can make an ordinary dish taste amazing.

Sometimes, we get a whole new palette of ingredients. (Many of us love to watch cooking shows for just this reason: we see novel dishes and decide if we’d like to try them at home.)

Image from the Netflix show Chef’s Table S3E6 – Virgilio Martinez

We need to take a ton of time to figure out what the new ingredients are, how we can work with them, and what we can cook. If we don’t cook with the ingredients fairly often, we lose the knowledge of how to use them and what ripeness is best.

Writing off something because it tastes bad in one context means we may miss a great use of the ingredient later – a ripe plum tastes so much better than an unripe one. Once someone has eaten an unripe plum, they may decide they hate plums.

But if we can overcome these hurdles and learn about the new ingredients, we can make some incredible dishes.

Image from the Netflix show Chef’s Table S3E6 – Virgilio Martinez

This is what I think we are going through with SharePoint and Office 365 right now. Microsoft is offering us an entirely new set of ingredients with which to make our stew.

Let me give you an example…

In my “meat and potatoes” way of looking at the world (which has been pretty consistent for the last ten years or so, even though SharePoint and my approaches have evolved), I might use this set of ingredients:

  • A Single Page Application (SPA) written with AngularJS or KnockoutJS – or even just plain old JavaScript
  • A dollop of values passed on the query string to a…
  • Standard list form, with a little JavaScript mixed in to pre-populate some columns in the form
  • A SharePoint Designer workflow to add notifications on top (Substitute Alerts if your local market doesn’t carry SharePoint Designer)

But there are new ingredients now. Instead we could whip something up with these:

  • A SharePoint Framework Web Part (still maybe written with AngularJS or KnockoutJS)
  • Creating list items using REST based on the values in our SPFx Web Part
  • Microsoft Flow to add in the notifications and any process
  • Stir in a pinch of PowerApps – until they are ready
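
The second new ingredient above, creating list items via REST, boils down to a single POST against the list’s items endpoint. Here is a hedged sketch in PowerShell for an on-premises farm; the site URL, list name, and field values are made up, and the list item type name (SP.Data.RequestsListItem) depends on your list. An SPFx web part would make the equivalent call from JavaScript using its own request infrastructure.

```powershell
# Hypothetical example: site URL, list name, and field values are placeholders.
$siteUrl = "http://intranet.contoso.com/sites/demo"

# Writes to the REST API need a form digest, fetched from the contextinfo endpoint
$digestResponse = Invoke-RestMethod -Uri "$siteUrl/_api/contextinfo" -Method Post `
    -Headers @{ "Accept" = "application/json;odata=verbose" } -UseDefaultCredentials
$digest = $digestResponse.d.GetContextWebInformation.FormDigestValue

# The __metadata type varies by list; "SP.Data.RequestsListItem" is an assumption
$body = @{
    "__metadata" = @{ "type" = "SP.Data.RequestsListItem" }
    "Title"      = "Request captured by the web part"
} | ConvertTo-Json

Invoke-RestMethod -Uri "$siteUrl/_api/web/lists/getbytitle('Requests')/items" `
    -Method Post -Body $body `
    -ContentType "application/json;odata=verbose" `
    -Headers @{ "Accept" = "application/json;odata=verbose"; "X-RequestDigest" = $digest } `
    -UseDefaultCredentials
```

The same pattern works for updating items (add an X-HTTP-Method: MERGE header) or deleting them.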

That’s quite a shift. We’re being asked to think about cooking in a very different way. We’ve been through stages of evolution before – new cooking techniques like sous vide (Sandbox Solutions), gelification (Add-In Model, née App Model), etc. – but this time it’s really different. We’re not even sure if we’re supposed to like everything we taste. Is it just the next wave of kale frenzy or is it an ingredient that will last?

At this point, Microsoft is asking us to dream big, and reach for the previously unimaginable. I think we need to try to do it.

Image from the Netflix show Chef’s Table S3E6 – Virgilio Martinez

Some of us will be able to cook up truly amazing solutions on the “modern” platform. Don’t be afraid to give it a taste.

Image from the Netflix show Chef’s Table S3E6 – Virgilio Martinez

In case you didn’t figure it out, this post was inspired by the Netflix show Chef’s Table S3E6, which profiles the Peruvian chef Virgilio Martinez. It’s an outstanding series, and this particular episode was stellar.


by Marc D Anderson via Marc D Anderson's Blog

Wednesday, May 17, 2017

The Ultimate Script to download Microsoft Build 2017 Videos AND slides!

Download Build 2017

With the number of great sessions at Build 2017 this year, there is no way you could have attended them all, and even though they are posted on Channel 9, you might want to download them to view offline! That is why I created this PowerShell script, so everyone can easily download the Microsoft Build videos AND slides, whether they were present at Build or not! Here are the features:

  • Downloads all the Microsoft Build 2017 sessions and slides, with each description saved in a text file
  • Groups them into folders
  • Makes sure no errors come up due to illegal file names
  • If you stop the script and restart it in the middle, it picks up where it left off rather than starting from the beginning
  • Filter by keywords in the session title!
  • Ability to choose between HD videos and lower quality

Download the script from here! Do not copy and paste from below, as WordPress sometimes mangles the PowerShell!

How to use:

First, make sure to change the $downloadlocation variable. By default it saves everything in C:\Build2017

To download all sessions, just run the script (that's about 1TB of content in High Definition)! EX:

.\DownloadBuild2017Content.ps1

To download sessions based on a keyword, use the -keyword parameter and separate keywords with commas. Make sure to put quotes around the keywords! EX:

.\DownloadBuild2017Content.ps1 -keyword "Graph,Bluetooth"

To download sessions based on the session code, use the -session parameter and separate session codes with commas. Make sure to put quotes around the session codes!

.\DownloadBuild2017Content.ps1 -session "P4116,P4118,C9R04"

Note: By default, the videos are downloaded in High Definition, and downloading all the Build videos in HD takes a lot of hard drive space. You can change the script (comment out the two HD feed lines and uncomment the two Mid Quality feed lines just below them) in order to download the lower-quality versions of the videos.

Follow me on Social Media and Share this article with your friends!


Leave a comment and don’t forget to like the Absolute SharePoint Blog Page on Facebook and to follow me on Twitter here for the latest news and technical articles on SharePoint. I am also a Pluralsight author, and you can view all the courses I created on my author page.

Here is the source code:

#Script written by Vlad Catrinescu
#Visit my site http://ift.tt/1eCEHiK
#Twitter: @vladcatrinescu
#Originally Posted here: http://ift.tt/2rr9kz9

Param(
  [string]$keyword,[string]$session
)


######    Variables  #####

#Location - Preferably enter something not too long to not have filename problems! cut and paste them afterwards
$downloadlocation = "C:\Build2017"
#Build 2017 Videos RSS Feed
[Environment]::CurrentDirectory=(Get-Location -PSProvider FileSystem).ProviderPath 
$rss = (new-object net.webclient)
$video1 = ([xml]$rss.downloadstring("http://ift.tt/2qsELM8"))
$video2 = ([xml]$rss.downloadstring("http://ift.tt/2rr3HRw")) 
#Other qualities for the videos only. Uncomment the two lines below and comment out the two lines above to download Mid Quality videos
#$video1 = ([xml]$rss.downloadstring("http://ift.tt/2qsCfFA"))
#$video2 = ([xml]$rss.downloadstring("http://ift.tt/2rr3FJo"))
$slide1 = ([xml]$rss.downloadstring("http://ift.tt/2qsOWAb"))
$slide2 = ([xml]$rss.downloadstring("http://ift.tt/2rrjVKg"))



#SCRIPT/ Functions  Do not touch below this line :)#
if (-not (Test-Path $downloadlocation)) { 
                Write-Host "Folder $downloadlocation doesn't exist. Creating it..."  
                New-Item $downloadlocation -type directory | Out-Null
        }
set-location $downloadlocation

function CleanFilename($filename)
{
    return $filename.Replace(":", "-").Replace("?", "").Replace("/", "-").Replace("<", "").Replace("|", "").Replace('"',"").Replace("*","")
}

function DownloadSlides($filter,$slidefeed)
{
    try 
    {    
        $slidefeed.rss.channel.item | Where{($_.title -like "*$filter*") -or ($_.link -like "*/$filter")} | 
        foreach {
                $code = $_.comments.split("/") | select -last 1    
        
                # Grab the URL for the PPTX file
                $urlpptx = New-Object System.Uri($_.enclosure.url)  
            $filepptx = $code + "-" + $_.creator + "-" + (CleanFileName($_.title))
                $filepptx = $filepptx.substring(0, [System.Math]::Min(120, $filepptx.Length))
                $filepptx = $filepptx.trim()
                $filepptx = $filepptx + ".pptx" 
                if ($code -ne "")
                {
                         $folder = $code + " - " + (CleanFileName($_.title))
                         $folder = $folder.substring(0, [System.Math]::Min(100, $folder.Length))
                         $folder = $folder.trim()
                }
                else
                {
                        $folder = "NoCodeSessions"
                }
        
                if (-not (Test-Path $folder)) { 
                        Write-Host "Folder $folder doesn't exist. Creating it..."  
                        New-Item $folder -type directory | Out-Null
                }

                # Make sure the PowerPoint file doesn't already exist
                if (!(test-path "$downloadlocation\$folder\$filepptx"))     
                {       
                        # Echo out the  file that's being downloaded
                        write-host "Downloading slides: $filepptx"
                        #$wc = (New-Object System.Net.WebClient)  

                        # Download the MP4 file
                        #$wc.DownloadFile($urlpptx, "$downloadlocation\$filepptx")
                Start-BitsTransfer $urlpptx "$downloadlocation\$filepptx" -DisplayName $filepptx
                        mv $filepptx $folder 

                }
            else
            {
                        write-host "Slides exist: $filepptx"
            }
            }

     }
    
    catch
    {
        $ErrorMessage = $_.Exception.Message
        Write-host "$ErrorMessage"
    }
}


function DownloadVideos($filter,$videofeed)
{
#download all the mp4
# Walk through each item in the feed 
$videofeed.rss.channel.item | Where{($_.title -like "*$filter*") -or ($_.link -like "*/$filter*")} | foreach{   
        $code = $_.comments.split("/") | select -last 1    
        
        # Grab the URL for the MP4 file
        $url = New-Object System.Uri($_.enclosure.url)  
        
        # Create the local file name for the MP4 download
        $file = $code + "-" + $_.creator + "-" + (CleanFileName($_.title))
        $file = $file.substring(0, [System.Math]::Min(120, $file.Length))
        $file = $file.trim()
        $file = $file + ".mp4"  
        
        if ($code -ne "")
        {
                 $folder = $code + " - " + (CleanFileName($_.title))
                 $folder = $folder.substring(0, [System.Math]::Min(100, $folder.Length))
                 $folder = $folder.trim()
        }
        else
        {
                $folder = "NoCodeSessions"
        }
        
        if (-not (Test-Path $folder)) { 
                Write-Host "Folder $folder doesn't exist. Creating it..."  
                New-Item $folder -type directory | Out-Null
        }
        
        
        
        # Make sure the MP4 file doesn't already exist

        if (!(test-path "$folder\$file"))     
        {       
                # Echo out the  file that's being downloaded
                write-host "Downloading video: $file"
                #$wc = (New-Object System.Net.WebClient)  

                # Download the MP4 file
                Start-BitsTransfer $url "$downloadlocation\$file" -DisplayName $file
                mv $file $folder
        }
    else
    {
                write-host "Video exists: $file"
    }

#text description from session
        $OutFile = New-Item -type file "$($downloadlocation)\$($Folder)\$($Code.trim()).txt" -Force  
    $Category = "" ; $Content = ""
    $_.category | foreach {$Category += $_ + ","}
    $Content = $_.title.trim() + "`r`n" + $_.creator + "`r`n" + $_.summary.trim() + "`r`n" + "`r`n" + $Category.Substring(0,$Category.Length -1)
   add-content $OutFile $Content
                
        }
}



if ($keyword)
{
    $keywords = $keyword.split(",")
    
    foreach ($k in $keywords)
    {
        $k = $k.trim()
        Write-Host "You are now downloading the sessions with the keyword $k"
        DownloadSlides $k $slide1
        DownloadSlides $k $slide2
        DownloadVideos $k $video1
        DownloadVideos $k $video2
    }
}
elseif ($session)
{
    $sessions = $session.Split(",")
    
    foreach ($s in $sessions)
    {
        $s = $s.trim()
        Write-Host "You are now downloading the session $s"
        DownloadSlides $s $slide1
        DownloadSlides $s $slide2
        DownloadVideos $s $video1
        DownloadVideos $s $video2
    }

}
else
{
    DownloadSlides " " $slide1
    DownloadSlides " " $slide2
    DownloadVideos " " $video1
    DownloadVideos " " $video2
}

 

The post The Ultimate Script to download Microsoft Build 2017 Videos AND slides! appeared first on Absolute SharePoint Blog by Vlad Catrinescu.


by Vlad Catrinescu via Absolute SharePoint Blog by Vlad Catrinescu

Let’s Capture Missing or Insufficient SharePoint REST Endpoints

Today I got an alert that the SharePoint UserVoice suggestion from Corey Roth (@coreyroth) entitled Add managed metadata term store operations to REST API got the coveted “Thinking About It” tag from the Product Group. I like to tweet out changes like this to let people know the Product Group is listening and acting on our feedback – beyond saying “That’s good feedback!” It’s not all wine and roses, though:

Thank you for your feedback! Just letting you know that we absolutely have this in our backlog, but unfortunately this currently is not included in our short term engineering tasks. We absolutely understand the request and seeing vote counts around this, will help to further prioritize this work for next sprints.

I got a couple of tweets back right away pointing out some other current holes in the REST APIs.

If you think there are other endpoints the REST APIs need, or endpoints that don’t work well, please add them in the comments here. I’ll work them up into a list for the Product Group, and let’s see what we can get moving! We’ll play by the rules and add the list to UserVoice, but I think the individual suggestions get lost there, making it harder to see the bigger picture.

The list so far:

  • Managed Metadata (aka Term Sets or Taxonomy – please stop making up multiple names for things Microsoft!!!)
  • Recurring Events – I have a long post in the works explaining how I handle this with a combination of REST and SOAP (with SPServices), and it isn’t pretty
  • Recurring events via the Search endpoint (Derek Gusoff)
  • Publishing – PublishingPageContent, PublishingPageImage (@gautamsheth)
  • Starting a site workflow – StartWorkflowOnListItemBySubscriptionId & StartWorkflow are only for list items (@BradOrluk)
  • Editing/adding Property Bag values (@alexaterentiev)

by Marc D Anderson via Marc D Anderson's Blog

Tuesday, May 16, 2017

SharePoint Virtual Summit Wrap-up

Today was a great day for those of us who work with SharePoint. If you did not participate, then you truly missed some great information about the future of SharePoint and Office 365. The event was similar to the May 4th event from last year, albeit completely virtual this time. 

read more


by via SharePoint Pro

Sunday, May 14, 2017

Microsoft Build 2017 - My Favorite Highlights and Announcements


If you're in this sphere, I'm pretty sure you either attended on-site, attended virtually, or at least heard about what Microsoft's Build 2017 had to offer last week.

With this post I simply want to highlight some of the recent features and announcements from Microsoft that I really liked. There was a lot more happening, but these are my key takeaways. Enjoy.

Azure Cloud Shell

A while ago I tweeted about a nice way Microsoft implemented a feature-teaser in the Azure Portal:

Well, at Build 2017 they kept their promise and disclosed - and opened up - the functionality of the Azure Cloud Shell.

It offers a quick and easy way to interact with Azure and run commands through the Bash interface, which also has the Azure CLI and PowerShell cmdlets available right at your fingertips.

Best of all? You're already signed in and authenticated, so there's no need for any of that either. (Which should also send a slight chill down your spine: don't jump into production and accidentally delete all of your data clusters...)

Check it out here:
Azure Cloud Shell
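
Assuming you use the Azure CLI side of the shell, a first session might look like this (output formats and subscription details will obviously vary):

```shell
# Inside Azure Cloud Shell (Bash) - nothing to install, no login prompt:
az account show --output table   # confirm which subscription you're working in
az group list --output table     # list the resource groups you can see
```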

Azure Snapshot Debugging


Now this is a pretty cool thing. Take a snapshot of your production environment, then debug the snapshot instead of the environment itself. Mind blown.

Read more:
Introducing the Snapshot Debugger preview for Azure

Production debugging your cloud has never been so simple - http://ift.tt/2rgdlHt

Session video: Snapshot debugging and profiling in Microsoft Azure: Next generation diagnostics for your in-production cloud apps

Azure Cosmos DB (formerly DocumentDB)


This is the successor to DocumentDB, if you've tried that.
Azure Cosmos DB is a globally distributed, multi-model database which supports key-value, document, graph, and column data models, along with a number of APIs.

Learn more: Welcome to Azure Cosmos DB

Fluent Design System


Fluent Design is a revamp of Microsoft Design Language 2 that will include guidelines for the designs and interactions used within software designed for all Windows 10 devices and platforms. The system is based on five key components: Light, Depth, Motion, Material, and Scale. - Source: http://ift.tt/2rgt3lW

Check it out: Microsoft Fluent Design System

OneDrive Files On-Demand


One of the most popular UserVoice suggestions for OneDrive was submitted on July 15, 2015. It has finally been heard, and with the announcements at Build 2017, it appears we're finally getting selective sync, or files on-demand.

Read the announcement: http://ift.tt/2pC1LEm

Azure Batch AI Training


Azure Batch AI Training helps you experiment in parallel with your AI models using any framework and then train them at scale across clustered GPUs. Simply describe your job requirements and configuration to run, and we’ll handle the rest.

What can I say? I love things that scale, and I love building clusters of processing power while making the applications better at the same time. If this is your type of thing, you should check out Azure Batch AI Training. They throw it all into Docker containers for easy virtualization and then off you go; they also take care of any plumbing required.

In other words: train your AI and machine learning models and don't worry about any hardware. Just do the fun stuff, and Microsoft has your back on the rest!

Watch the Sessions for Free

In their continued awesome fashion, Microsoft is releasing most (if not all) recordings from the Build conference. They're available on Channel 9 for everyone to freely enjoy.

Check them out here:

http://ift.tt/2oYijLy


by Tobias Zimmergren via Zimmergren's thoughts on tech

Saturday, May 13, 2017

User Profile Photo Import from thumbnailPhoto using MIM and the SharePoint Connector

When leveraging Microsoft Identity Manager (MIM) and the SharePoint Connector for User Profile Synchronization, some customers have a requirement to import profile pictures from the thumbnailPhoto attribute in Active Directory.

This post details the correct way of dealing with this scenario, whilst retaining the principle of least privilege. The configuration that follows is appropriate for all of the following deployments:

  • SharePoint 2016, MIM 2016, and the MIM 2016 SharePoint Connector
  • SharePoint 2013, MIM 2016, and the MIM 2016 SharePoint Connector
  • SharePoint 2013, FIM 2010 R2 SP1 and the FIM 2010 R2 SharePoint Connector

    * Note: you can also use MIM or FIM 2010 R2 SP1 with SharePoint 2010, although this is not officially supported by the vendor.

Before we get started, it is important to understand that if the customer requirement is simply to import basic profile properties from Active Directory, with the addition of profile photos, then MIM/FIM is almost certainly the wrong choice. SharePoint’s Active Directory Import capability, alongside some simple PowerShell or a console application, will deliver this functionality with significantly less capital and operational cost.
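
As a rough illustration of that simpler route, a photo-import sketch might look something like the following. This is a hypothetical outline, not production code: the My Site Host URL, domain, and folder names are placeholders, it assumes the Profile Pictures folder already exists, and error handling is omitted.

```powershell
# Hypothetical sketch: pull thumbnailPhoto from AD and stamp it on each profile.
# URLs, domain (FABRIKAM), and folder names below are placeholders.
Add-PSSnapin Microsoft.SharePoint.PowerShell
Import-Module ActiveDirectory

$mySiteHostUrl = "http://mysites.fabrikam.com"
$site    = Get-SPSite $mySiteHostUrl
$context = Get-SPServiceContext $site
$upm     = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)
$library = $site.RootWeb.Lists["User Photos"]
$folder  = $library.RootFolder.SubFolders["Profile Pictures"]  # assumed to exist

foreach ($adUser in (Get-ADUser -LDAPFilter "(thumbnailPhoto=*)" -Properties thumbnailPhoto))
{
    $accountName = "FABRIKAM\" + $adUser.SamAccountName
    if ($upm.UserExists($accountName))
    {
        # Upload the raw photo bytes into the My Site Host's User Photos library...
        $fileName = "$($adUser.SamAccountName).jpg"
        $folder.Files.Add($fileName, $adUser.thumbnailPhoto, $true) | Out-Null

        # ...and point the profile's PictureURL property at it
        $profile = $upm.GetUserProfile($accountName)
        $profile["PictureURL"].Value = "$mySiteHostUrl/User Photos/Profile Pictures/$fileName"
        $profile.Commit()
    }
}
# Update-SPProfilePhotoStore would then generate the resized renditions
```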

However, many customers are dealing with more complicated identity synchronization requirements and thumbnailPhoto is merely one of the elements required. Due to some bizarre behaviour of SharePoint’s ProfileImportExportService web service, previous vendor guidance on this capability has been inaccurate, and indeed yours truly has provided dubious advice on this topic in the past.

Most enterprise identity synchronization deployments have stringent requirements regarding the access levels granted to the variety of accounts used. This is just as it should be: no credible identity subsystem allows more privilege than necessary to get the job done. Naturally, a system which provides a “hub” of identity data should be as secure as possible. Because of this security posture, many customers have complained about the level of access “required” by the account used within the SharePoint Connector (Management Agent). In some cases, customers have refused to deploy, or have used alternative means to deal with thumbnailPhoto. It is no small deal for those customers.

 

What is the issue?

Assume that MIM Synchronization is configured using an Active Directory MA, and a SharePoint MA with the Export option for Picture Flow Direction*. The account used by the SharePoint MA is added to the SharePoint Farm Administrators group as required. We then perform an initial full synchronization. MIM Synchronization successfully exports 217 profiles to the UPA.

image

image

*Note: 218 is the Farm Administrator plus the 217 new profiles.
The Import option for Picture Flow Direction, whilst available in the UI and PowerShell, is not implemented and therefore won’t do anything.

We will, however, notice some rather puzzling results for the profile pictures. The Profile Pictures folder is correctly created within the My Site Host Site Collection’s User Photos Library. Yet only some of the profile pictures are created: in this example, 112 of them. What happened to the other 105?

image

image

The numbers will actually vary: I can run this test scenario hundreds of times (and believe me, I have!) and get different numbers each time. However, roughly half the pictures are created each time.

This is the problem which has led to incorrect guidance. It really is quite a puzzler. Obviously, some files are created, and thus logic suggests that the account calling the web service has the appropriate permissions. If the permissions were wrong, surely no files would be created. Alas, this is SharePoint after all, and sometimes it really isn’t worth the cycles! The bottom line: there is an issue with the web service, and that’s not something which can easily be resolved.

The documentation for the FIM 2010 R2 SP1 SharePoint Connector, the previous version of the currently shipping release, remains the best documentation available. It notes:

When you configure the management agent for SharePoint 2013, you need to specify an account that is used by the management agent to connect to the SharePoint 2013 central administration web site. The account must have administrative rights on SharePoint 2013 and on the computer where SharePoint 2013 is installed.

If the account doesn’t have full access to SharePoint 2013 and the local folders on the SharePoint computer, you might run into issues during, for example, an attempt to export the picture attribute.

If possible, you should use the account that was used to install SharePoint 2013.

 

This, to a SharePoint practitioner, is clearly poor guidance. Whilst it’s true the MA account must connect to Central Administration, that means it must be a Farm Administrator. There is no requirement for the account to have other administrative rights on the SharePoint Farm, and there is no requirement for any machine rights on any machine in the SharePoint Farm. And certainly no access to the local file system of a SharePoint server is needed. Furthermore, there is no scenario whereby the SharePoint Install account should ever be used for runtime operations of any component, anywhere, in any farm!  Of course, this is material authored by the FIM folks, and there is no reason to expect them to be entirely familiar with the identity configuration of SharePoint, especially given that the topic is confusing to most SharePoint folks as well!

When I delivered the “announce” of the MIM MA at Microsoft Ignite last fall, I made a point of this issue by stating that the Farm Account should be used by the SharePoint MA if importing from thumbnailPhoto. This is also incorrect guidance. In my defence, at the time we had worked for a couple of weeks to try to get to the bottom of the issue, and ran out of time before the session. Thus, to show it all working, there was little choice. It’s pretty silly to do a reveal of something if the demo doesn’t work.

Using the Farm Account for anything other than the Farm is a bad idea. In this case, it’s extremely dubious, as in a real-world deployment the account’s password would need to be known by the MIM administrator. Internal security compliance at any large corporation is simply not going to accept that.

Others have suggested that the SP MA account be added to the Full Control Web Application User Policy for the My Site Host web application, or rather that the GrantAccessToProcessIdentity() method of the web application be used, which results in the above policy. That guidance is also inherently very bad. A large number of deployments now make use of a single Web Application, and granting the MA account Full Control over that is patently a bad idea. Furthermore, such configuration allows unfettered access to the underlying content databases (which store the users’ My Sites, remember!) and provides Site Collection Administrator and Site Collection Auditor rights on the My Site Host.

 

The Workaround

So, we don’t want to use the Install Account, we don’t want to use the Farm Account, and we don’t wish to configure an unrestricted policy.

The answer to this conundrum is to configure a brand-new Permission Policy to which we will add a User Policy for the SharePoint MA account. This enables all the pictures to be created, without granting any more permissions than necessary.

image

The Grant Permissions for this policy are: Add Items, Edit Items, Delete Items, View Items and Open Site. No more, no less.

Then we add a new User Policy for the Web Application hosting the My Site host Site Collection, for the SharePoint MA account, with this Policy Level:

image

Now at this point if we perform another Full Synchronization we have a problem. As far as MIM Synchronization is concerned the previous export worked flawlessly. It thinks all the pictures are present. This is because the ProfileImportExportService didn’t report any exceptions. The failures have been lost to the great correlation ID database in the sky. Gone forever. If we search the SharePoint MA’s Connector Space within MIM, we will see the photo data present and correct. There are zero updates to make.

Of course, the idea is to correctly configure all of this before performing the initial Full Synchronization. However, if you are following along, we can “fix” this by deleting the SharePoint MA’s Connector Space and then performing a Full Synchronization. This will force a fresh Export to SharePoint (there is no need to delete the AD Connector Space).

Once the Full Synchronization has completed, we will see the correct number of items within the User Photos Library (one item is the Profile Pictures folder, the other 217 are images).

image

Of course, we would also need to run Update-SPProfilePhotoStore at this point to generate the three images for each profile, and delete the initial data (the files with GUIDs for filenames).
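
If you are following along in the SharePoint Management Shell, that final step looks like this (the My Site Host URL is a placeholder for your own):

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell
# Generate the small/medium/large renditions and clean up the imported originals
Update-SPProfilePhotoStore -MySiteHostLocation "http://mysites.fabrikam.com" `
    -CreateThumbnailsForImportedPhotos $true
```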

 

But wait, there is more!

As you may be aware the UPA does not fully understand Claims identifiers for internal access control. This is why we must enter Windows Classic style identifiers for UPA permissions and administrators.

Whilst we can create the new Permission Policy in Central Administration, we cannot create a new User Policy there using a Windows Classic identifier. Whatever we enter in the UI will be transformed into a claims identifier. For this to work, the policy must be as shown in the screenshot above (FABRIKAM\spma), using a Classic identifier. And yes, I do feel stupid calling a NetBIOS username “classic”, but I am not in charge of naming anything :)

In order to configure the policy correctly for this use case we must use PowerShell. Which is actually just fine, because we don’t really want to be using the UI anyway. We can also combine all this work into a simple little script to create both the Permission Policy and the User Policy, as shown below.

Add-PSSnapin -Name "Microsoft.SharePoint.Powershell"

# update these vars to suit your environment
$WebAppUrl = "http://ift.tt/2qEhmbC"
$PolicyRoleName = "MIM Photo Import"
$PolicyRoleDescription = "Allows MIM SP MA to export photos to the MySite Host."
$GrantRightsMask = "ViewListItems, AddListItems, EditListItems, DeleteListItems, Open"
$SpMaAccount = "FABRIKAM\spma"
$SpMaAccountDescription = "MIM SP MA Account"


# do the work
$WebApp = Get-SPWebApplication -Identity $WebAppUrl
# Create new Permission Policy Level
$policyRole = $WebApp.PolicyRoles.Add($PolicyRoleName, $PolicyRoleDescription)
$policyRole.GrantRightsMask = $GrantRightsMask
# Create new User Policy with the specified account
$policy = $WebApp.Policies.Add($SpMaAccount, $SpMaAccountDescription)
# Configure the Permission Policy to the User Policy
$policy.PolicyRoleBindings.Add($policyRole)
# Commit
$WebApp.Update()

 

Summary

There you have it. How to use a least privilege account for the SharePoint MA, and successfully import thumbnailPhoto from Active Directory. In summary, the required steps are:

  1. Create an account in Active Directory for use by the SharePoint MA (e.g. FABRIKAM\spma)
  2. Add the account as a SharePoint Farm Administrator using Central Administration or PowerShell
  3. Create the Permission Policy and User Policy for the account using the PowerShell above
  4. Configure the SharePoint MA with this account, and select Export as the Picture Flow Direction. If you are using the MIMSync toolkit, the thumbnailPhoto attribute flow is already taken care of; if you are not, you will need to configure the necessary attribute flow.
  5. Perform Synchronization operations
  6. Execute Update-SPProfilePhotoStore once Synchronization is complete to create the thumbnail images used by SharePoint
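Steps 2 and 6 can be scripted as well. The following is a rough sketch, not verbatim from the post: the MySite Host URL is a placeholder you must change, and it assumes the default "Farm Administrators" group on the Central Administration site.

```powershell
# Step 2: add the account to the Farm Administrators group (run on a farm server)
$caWebApp = Get-SPWebApplication -IncludeCentralAdministration |
    Where-Object { $_.IsAdministrationWebApplication }
$caWeb = Get-SPWeb -Identity $caWebApp.Url
$caWeb.SiteGroups["Farm Administrators"].AddUser("FABRIKAM\spma", "", "MIM SP MA Account", "")
$caWeb.Dispose()

# Step 6: generate the thumbnails once synchronization is complete
# ("http://mysite.fabrikam.com" is a placeholder for your MySite Host URL)
Update-SPProfilePhotoStore -MySiteHostLocation "http://mysite.fabrikam.com" `
    -CreateThumbnailsForImportedPhotos $true
```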

Now of course, we have added the SP MA account as a Farm Administrator, so it could be used to do just about anything to the farm. Least privilege is always a compromise, and in this case Farm Administrator membership is a requirement of the ProfileImportExportService – a SharePoint product limitation. Therefore, this approach is the best compromise available, and one that has already been accepted by security compliance in three enterprise customers using MIM and the SharePoint Connector. The bottom line is that if you don’t trust your MIM administrators, or indeed your SharePoint ones, you’ve got bigger security problems than a couple of accounts!

Also, none of this explains why, without the policy configuration or overly aggressive permissions, the web service creates some pictures but not others. But life is just too short to worry about that rabbit hole!

Finally, it is always good to remember the mantra of the SharePoint Advanced Certification programs, “just because you can, doesn’t mean you should”. This post is not intended to promote the use of MIM for just the profile photo. Furthermore, using thumbnailPhoto in AD for photos is just one approach of many. For some organisations, especially the larger ones, this would be a spectacularly stupid implementation choice, and of course in many others Active Directory is not the master source of identity anyway.

 



by Spence via harbar.net

Rencore Tech Talks - Episode 008 - Benefits of using Office 365 PnP in your organization, with Eric Overfield


Note: this episode was recorded on 2017-02-19.

This time I’m catching up with Eric Overfield. We're talking about the benefits of using Office 365 Patterns and Practices in your organization, and what it has to offer. Eric shares his best tips on the core benefits, where to find more information, and more.

Episode Guest, Eric Overfield

PixelMill Co-Founder, SharePoint Branding and UI Designer, Builder and Creator.
In 2016, Microsoft recognized Eric's contributions to SharePoint and Office 365 by awarding him as a Microsoft MVP, and then in 2017, Microsoft recognized his community leadership and general technology expertise by accepting him into the Microsoft Regional Director program.

Some of the topics we touch on are:

Listen

Get the full transcript

All the best,
Tobias Zimmergren


by Tobias Zimmergren via Zimmergren's thoughts on tech

Friday, May 12, 2017

Beware the Office 365 Group-Based Site Regional Settings!!!

This is a quick post, yet it’s still an important one. We’re using more and more Office 365 Group-based SharePoint sites these days. Even when you know you aren’t going to use some of the goodies you end up with, this type of site is making more and more sense.

BUT, there’s a simple problem that can have longer-term ramifications. The default time zone for every new Group-based site we create is Pacific Time (UTC-08:00). You have to go into Site Settings to change it manually for every site you create this way. Since a lot of my clients are on Eastern Time, this is tedious.

I’m guessing no one in Redmond even notices this, because PDT is their time zone. I spot it every time I create a new Group-based site during a migration because Sharegate warns me the time zones of the source and destination sites are different when I start to copy content across. (Yay, Sharegate!)

If you happen to be a non-US person, then ALL of the regional settings are likely to be wrong for you. I’ve checked, and there is no way to change the default here – unless it’s a VERY recent change.
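If you want to audit which of your sites still carry the default, a read-only check is straightforward. This sketch assumes the SharePoint PnP PowerShell module is installed, and the site URL is a placeholder; any CSOM-capable tool would work the same way.

```powershell
# Read a site's time zone via PnP PowerShell (read-only audit sketch)
# "https://contoso.sharepoint.com/sites/newgroupsite" is a placeholder URL
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/newgroupsite"
$web = Get-PnPWeb -Includes RegionalSettings.TimeZone
# A freshly created Group-based site will typically report the Pacific Time default
$web.RegionalSettings.TimeZone.Description
```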

Here are some Office 365 UserVoice suggestions you can run off to vote for:

 


by Marc D Anderson via Marc D Anderson's Blog

Wednesday, May 10, 2017

Get the SharePoint 2013 / 2016 Datapolis Process System for Free!

Community Blast Provided and Sponsored by Datapolis. This information was provided by the vendor for community education on the product.

With Microsoft not recommending SharePoint Designer workflows anymore and Flow not quite there yet, a lot of businesses are looking for a third-party option for workflows in SharePoint and Office 365. One of those vendors, Datapolis, is offering a full-featured version of their Datapolis Process System (DPS) product for free to customers using SharePoint 2013 or 2016. Datapolis Process System is a visual workflow designer for SharePoint 2013 and 2016 that offers a robust set of tools to simplify workflow development and deployment. It enables users to design clear and understandable business process diagrams which support human-centered workflows.

Even if the name says Basic, you actually get the fully functional Datapolis DPS system with all the features! The only limitation is that you can have a maximum of 1000 workflow actions per month, which in my opinion is quite a lot and enough to cover the workflows of a small department! If you have never seen what Datapolis looks like before, check out this blog post by Gokan Ozcifci: Datapolis Process System – A Visual Workflow Designer Explained In 15 Bulletpoints

To get it, simply go to http://ift.tt/2r1iYc9, enter a bit of information about yourself, and you are good to go!

The post Get the SharePoint 2013 / 2016 Datapolis Process System for Free! appeared first on Absolute SharePoint Blog by Vlad Catrinescu.


by Vlad Catrinescu via Absolute SharePoint Blog by Vlad Catrinescu

Tuesday, May 9, 2017

Dear Microsoft: Confusing Failures in “Modern” SharePoint Libraries

Moving documents around in the “modern” Document Libraries in SharePoint Online (Office 365) has certainly gotten easier. Instead of opening libraries in File Explorer, downloading the files, and then uploading them into the new location, we have nice options like Move to, Copy to, and Rename right in the toolbar. That's a big upside.

Sadly, there are downsides. While the new capabilities are awesome, I’m finding that the UI is often confusing – both to me and my clients.

When you use one of these new file manipulation capabilities, you get a little feedback message on the right side of the toolbar telling you how it went. Unfortunately, the message location isn’t all that obvious, and it usually feels as though the operation succeeded even when it didn’t.

Here’s an example of a Move to operation that failed for me. Note that the message says “1 item wasn’t moved”.

There’s no other feedback, and on my wide monitor, the message is way over to the right side. I don’t usually notice it.

In the case above, I did see the message, so I clicked on the message to see what was up. The reason I saw it was that I was looking for something wrong. A client of mine told me that a Move to wasn’t working. Every time she went back to the library, the documents were still in the same place, no matter how many times she moved them.

As you can see from the expanded message below, there was indeed an issue: a file with the same name already existed in the destination location. I have two logical options: Keep both files or Replace the original.

 

The message really isn’t obvious enough, and I’ve been caught many times because I didn’t see that something I did failed. Even worse, to my client “SharePoint is broken”. I’m hoping the Product Group can come up with a more informative way to provide the feedback that something has gone wrong – which happens surprisingly often!

There is yet another “unfortunately” here, though. In my client’s case, even when there is no error in a Move to operation, the files are boomeranging right back into the original position after a few screen refreshes.

I also tried the Content and Structure page to see if moving the document that way would help, but still no dice. I checked to see if perhaps there was a workflow in play here causing issues, but that’s not the case. The only other thing I see which MIGHT be a problem is that the library has 4988 documents in it, which is pretty close to the 5000 list view threshold. I hate that threshold with a steaming passion (have I mentioned that here and here and here and here and here and probably tens of dozens of other places?), but I can’t think why it would matter in this case.

So we’re at an impasse. The error messages aren’t so great, and the Move to operations are failing even when there aren’t errors. Maybe SharePoint really is broken.

Anybody? Anybody?


by Marc D Anderson via Marc D Anderson's Blog

Wednesday, May 3, 2017

Win a free Pass to SharePoint Fest Denver 2017

This May, I am really glad to be speaking at one of my favorite SharePoint conferences: SharePoint Fest. SharePoint Fest usually runs four conferences per year (Denver, DC, Seattle, Chicago), and I try to speak at all of them every year! To give back to the awesome SharePoint community, and to try to meet as many blog readers as possible, I talked to SharePoint Fest and they gave me one SharePoint Fest Denver 2017 Gold Pass to give away to the readers of my blog!

 

To enter, you simply have to put in your name and email, and you can get bonus entries if you follow @vladcatrinescu on Twitter and like the Absolute SharePoint page on Facebook! Furthermore, you can tweet about the giveaway every day and get 9 bonus entries every time you tweet!

Choose from over 80 Sessions and 19 Workshops in Multiple Tracks!

Join us May 30 – June 2, 2017 at SharePoint Fest – Denver, the largest independent SharePoint and Office 365 conference in North America, with the most sessions and the best speakers!

Over 90 Sessions in 10 Tracks

At SharePoint Fest – Denver there are sessions created for SharePoint administrators, software developers, information architects, and power users, which will ensure that you and your team walk away with as much knowledge as you desire to truly leverage SharePoint in your current environment!

Choose one complete learning track or mix and match based on what content best meets your and your organization’s current needs!

SharePoint Fest is a two-day training conference (plus two optional days of workshops) that will have over 80 sessions spread across multiple tracks that bring together SharePoint enthusiasts and practitioners with many of the leading SharePoint experts and solution providers in the country.

At SharePoint Fest, attendees will be able to attend workshops and seminars – taught by Microsoft Certified Trainers, Microsoft engineers, and Microsoft MCMs and MVPs – covering Enterprise Content Management, Implementation/Administration, Business Value, Search, Business Intelligence, Office 365 and SharePoint Development. Attendees will be able to choose one complete learning track or mix and match based on what content best meets their current needs.

There will be sessions created for SharePoint administrators, software developers, business analysts, information architects, and knowledge workers, which will ensure that attendees walk away with as much knowledge as they desire to truly leverage SharePoint in their current environment.

SharePoint Fest Denver 2017 Giveaway

Enter the raffle below to win one of the two passes! You have until May 12th to enter! Remember that you can tweet about the giveaway every day in order to win more entries!

Click here to view this promotion.

The post Win a free Pass to SharePoint Fest Denver 2017 appeared first on Absolute SharePoint Blog by Vlad Catrinescu.


by Vlad Catrinescu via Absolute SharePoint Blog by Vlad Catrinescu