Author

Hugo Bernier

Introduction

Last week, I attended the SharePoint 2018 Conference in Las Vegas. There were a lot of cool announcements and demos. The SharePoint team rocks!

One of the cool things I noticed, which has nothing to do with SharePoint, was that many presenters who showed code had a really cool command prompt that showed the node module they were in and their Git branch status in a pretty "boat chart".

Console showing node module version and git branching information

I had seen this many times before, but never realized how much easier it was to get a sense of what's going on until I was watching someone else code on a big screen.

Of course, I set out to find and configure this awesome command line on my workstation.

This article will show you how you too can install and configure this command line interface.

Cmder

During Vesa's awesome session, I paid close attention to the title of his command line window. It said Cmder.

I had seen Cmder before; the article Set up your SPFx development environment mentions Cmder in the Optional Tools section.

But the version of Cmder I had installed didn't have the fancy "boat chart" at the top that got my attention.

As it turns out, you need to download another custom prompt for Cmder that adds the Powerline (that's the real name for the "boat chart") at the top.

Here is how to install and configure Cmder with the Powerline command prompt:

Installing Cmder

  1. Go to http://cmder.net/ and download either the Mini pack or the Full pack.
  2. Unzip the package. Cmder is designed to be portable and to require no administrative privileges to run, so their instructions tell you to not install it in the Program Files folder (where you'll need administrative privileges). I placed it in C:\Users\[myusername]\AppData\Local\cmder.
  3. Open a command prompt in Administrative mode from the folder where you copied the Cmder files.
  4. From the command-prompt, type:
    cmder /REGISTER ALL
  5. If you get an Access Denied error, you probably forgot to run the command in Administrative mode. If you don't know how to do that, type cmd from your Start menu, and right-click on Command Prompt and select Run as administrator.
  6. Cmder should be installed. You can verify by opening a new File Explorer window and right-clicking on a folder. You should get a Cmder Here option.
    Cmder Here

Unfortunately, if you open Cmder with that command line, you don't get the fancy Powerline.

Let's fix that!

Installing Cmder Powerline custom prompt

The Cmder Powerline custom prompt changes the Cmder prompt to include the following modifications:

  • The folder portion of the prompt is displayed in blue. The user's home folder is also replaced with a tilde (~).
  • If the current folder is an npm package, the prompt will display the package name and version number in teal.
  • If the current folder is a Git repository, the prompt will display the branch name with a green colour if the branch is unchanged, or yellow if changes are found.

To install the Cmder Powerline custom prompt:

  1. Download the AnonymousPro font. You can do so by clicking on each TTF file in GitHub and selecting View Raw. For your convenience, here are the links to the raw files:
    Anonymice Powerline Bold Italic.ttf
    Anonymice Powerline Bold.ttf
    Anonymice Powerline Italic.ttf
    Anonymice Powerline.ttf
  2. Once you have downloaded the fonts, install them by double-clicking each one and selecting Install.
  3. Copy all the .lua files from the Cmder Powerline source and place them in the config folder under the Cmder install folder.
  4. If you haven't done so yet, launch a Cmder window by going to the folder where you installed it and double-clicking on Cmder.exe.
  5. From the Cmder window, open the Settings by hitting Windows-Alt-P.
  6. From the Main settings area, select Anonymice Powerline font from the Alternative font (pseudographics, CJK, etc.) drop-down.
  7. In the Unicode ranges combo box, type E0A0-E0B0 and select Apply.
  8. Select Save settings to save your settings and return to the command prompt in Cmder.

CmderSettings

That's all you need to do.

Cmder with Visual Studio Code

If you want Cmder to show up in Visual Studio Code, follow these steps:

  1. Launch Visual Studio Code.
  2. From the File menu, select Preferences | Settings or use Ctrl-, (Control and comma). This will open your settings editor.
  3. In the right pane of the settings editor (the one that's actually editable), insert the following JSON just before the closing brace, making sure to replace the path to Cmder with the path where you installed it.
    "terminal.external.windowsExec": "C:\\Users\\[myusername]\\AppData\\Local\\cmder\\Cmder.exe",
    "terminal.integrated.shell.windows": "cmd.exe",
    "terminal.integrated.shellArgs.windows" : [
    "/K",
    "C:\\Users\\[myusername]\\AppData\\Local\\cmder\\vendor\\init.bat"
    ],

That's all!

Conclusion

I hope that you'll find Cmder and the custom Cmder Powerline command-prompt useful in your SPFx development endeavors.

I know I did!

For More Information

Cmder.net lists more information about Cmder, including the super-powerful shortcut keys.

Amr Eldib is the brilliant mind behind the Cmder Powerline command-prompt.

Sahil Malik has detailed instructions (and a video!) on how to integrate Cmder with Visual Studio Code.

Update

In the previous revision of this article, I had forgotten to include the steps to copy the .lua files to the config folder. It works much better when you include all the steps, it turns out 🙂

 

Introduction

One of the premises of SPFx is that, with it, third-party developers have the same set of tools that the SharePoint team has. So, if you like the look of an out-of-the-box web part you can, in theory, reproduce the same look and feel yourself.

A friend of mine needed to display a list of upcoming events, but the events are coming from a WordPress site that uses the WP Fullcalendar widget. They also really liked the look of events in SharePoint.

So, I thought: why not try re-creating the out-of-the-box SharePoint events web part, but instead of reading events from a SharePoint list (or group calendar), it would read from WordPress?

Since I was taking the challenge, I decided to also try to do these extra features:

  • Read events from multiple event providers, including RSS, iCal, and WordPress.
  • Support additional event providers without having to re-design the entire web part.
  • Make the web part responsive, just like the SharePoint events web part, with a narrow view and a wide view.
  • Support "Add to my calendar".
  • Make it possible to add more web parts, for example, the Event Search web part, reusing as many of the components as possible.

This article will explain the various components of this web part. Because I tend to ramble on and on, I'll then explain how to write every component of the web part in separate articles so that you can read as much (or as little) as you want.

And if you really don't want to read the articles, you can always get the code. I won't be offended if you do.

The Web Part

Configuration

If you download the web part and run

gulp serve

you'll see the web part in your web part catalog.

Adding the web part

Note: when I designed this web part, I created an SVG icon for it. At the time of this writing, there was an issue with using custom base64-encoded SVG icons. If your icon doesn't look like the one in the picture above, don't worry.

When you add the web part, you'll be prompted to configure it:

Configure event feed

Selecting the Configure button (or selecting Edit web part in the web part's "toolbox") will launch the web part's property pane.

The web part's property pane

In the property pane, the Feed type drop-down lists all the service providers that the web part can find.

feedtype

The idea is that if we add more feed types, they'll automatically show up here. Let me know in the comments if you have an idea for a feed type you think we should add; or, if you'd like to add one yourself, just submit a pull request.

If you're running the web part in a development environment, it'll offer you a Mock option, which will add bogus events for testing purposes. In production, this option will not appear.

The Feed URL input box will prompt you to enter a URL for the feed you wish to display. It validates the URL format (but doesn't yet check the URL for results).

FeedUrl

Because the WordPress feed URL that I was using supports a from and to date value in the URL, I added the ability to automatically insert today's date and an end date (see below). All you have to do is to add a {s} where you want the start date and {e} where you want the end date.
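For example, a hypothetical WordPress events feed URL with the date placeholders would look like this (the endpoint is made up; use whatever your calendar plugin exposes):

    https://example.com/wp-json/tribe/events/v1/events?start_date={s}&end_date={e}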

The Date range drop-down allows you to select anything from Next week to Next year.

DateRange

Unlike the out-of-the-box SharePoint events search, I didn't add an All events option because there was no way (that I know of) in React to find the maximum possible date. I could have passed a null value around, but I didn't want to do that. If there are enough requests for it, I'll figure out a way to do All events later.

The only event provider that I know of which actually supports specifying a start and end date is WordPress. When a provider doesn't support filtering at the source, I just filter the events after I have received them.

In the Advanced section, you can specify the Maximum number of events per page for the narrow view (the normal view just fits in as many events as it can on every page).

MaxPageSize

The default matches what the SharePoint events web part does, but you can put as many events as you want on every page. You can also put 0 if you don't want pagination for the narrow view.

When I was testing this web part, I kept on getting all sorts of CORS issues on some of the feeds I was using. So I added a Use proxy option, which -- you guessed it -- routes your requests through a proxy.

UseProxy

Finally, the web part can use the user's local storage to cache events it retrieves so that the web part doesn't fetch every. single. time. you. resize. the. page.

CacheDuration

You can set the cache duration from 0 to 1440 minutes (1 day) in 15 minute increments. Be careful, though, because it'll always cache a user's results from the time they last retrieved the events. So, if you set it to cache for a day, it'll wait an entire day before reloading events again no matter the time of the day. You should probably set it to half-a-day, just to be safe.

If you don't want to cache, you can set the cache duration to 0 and it'll refresh from the source every time. If your feed is slow, the web part will take forever to load every time.
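If you're curious, the caching idea boils down to something like this sketch (a hypothetical helper for illustration, not the web part's exact code):

    interface ICachedEvents {
      storedOn: number;   // epoch milliseconds at the time the events were cached
      events: any[];
    }

    // return the cached events if they are younger than maxAgeMinutes; otherwise undefined
    function getCachedEvents(key: string, maxAgeMinutes: number): any[] | undefined {
      const raw: string | null = localStorage.getItem(key);
      if (!raw) {
        return undefined;
      }

      const cached: ICachedEvents = JSON.parse(raw);
      const ageInMinutes: number = (Date.now() - cached.storedOn) / 60000;
      return ageInMinutes < maxAgeMinutes ? cached.events : undefined;
    }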

The Apply button is just to make sure that the web part won't try to load the feed as you type the URL.

Assuming you configured the web part (and that my code works well), you'll get to see your events in a pretty calendar view soon enough.

The narrow view

When you put the web part in a single-column, or when the web part is less than 480 pixels wide, the web part renders a list view of events.

NarrowView.png

The list will render all the events retrieved and paginate the results according to the page size option you configured.

The dates are rendered to look like a page-a-day calendar.

DateBox

If the event spans over multiple days, the date box will render differently:

MultiDayDateBox

The pagination component renders a Previous and Next button, and helps manage how many pages to render, which page to render, etc. Unfortunately, Office UI Fabric doesn't offer a pagination control so I had to write my own.

Of course, if I wasn't so lazy, I would have created a full pagination control with page numbers, and all, but the SharePoint events web part doesn't show the page numbers so I didn't do it. If there is enough demand for it, I'll make the component more generic and add the page numbers.

The Normal view (or carousel view)

When you view the web part on a full page (or when it is wider than 480 pixels), the web part switches to a carousel view.

Carousel View

The carousel view is responsive and renders between 1 and 4 events per page.

Like the SharePoint events web part, there is a next and previous arrow when you mouse over the calendar, with dots at the bottom to indicate what page you're on.

CarouselNav

Finally, the Add to my calendar button creates a dynamic ICS file, allowing you to import the event to most calendars on most devices.
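Building an ICS payload is mostly string formatting; here is a minimal sketch of the idea (a hypothetical helper for illustration, not the web part's exact code):

    // format a date as the UTC timestamp format ICS expects (e.g. 20180601T130000Z)
    function formatICSDate(d: Date): string {
      return d.toISOString().replace(/[-:]/g, "").split(".")[0] + "Z";
    }

    // build a single-event iCalendar payload that can be saved as an .ics file
    function buildICS(title: string, start: Date, end: Date): string {
      return [
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "BEGIN:VEVENT",
        `SUMMARY:${title}`,
        `DTSTART:${formatICSDate(start)}`,
        `DTEND:${formatICSDate(end)}`,
        "END:VEVENT",
        "END:VCALENDAR"
      ].join("\r\n");
    }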

Conclusion

In upcoming articles, I'll show how to build this, component by component.

I hope that you'll enjoy it.

Why would you want to inject CSS?

Since Microsoft introduced Modern Pages to Office 365 and SharePoint, it is really easy to create beautiful sites and pages without requiring any design experience.

If you need to customize the look and feel of modern pages, you can use custom tenant branding, custom site designs, and modern site themes without incurring the wrath of the SharePoint gods.

If you want to go even further, you can use SharePoint Framework Extensions and page placeholders to customize well-known areas of modern pages. Right now, those well-known locations are limited to the top and bottom of the page, but I suspect that in a few weeks, we'll find out that there are more placeholder locations coming.

But what happens when your company has a very strict branding guideline that requires very specific changes to every page? When your customization needs go beyond what's supported in themes? When you need to tweak outside of those well-known locations?

Or, what if you're building a student portal on Office 365 and you need to inject a custom font in a page that is specifically designed to help users with dyslexia?

That's when I would use a custom CSS.

Here be dragons!

Before you go nuts and start customizing SharePoint pages with crazy CSS customizations, we need to set one thing straight:

With SharePoint, you should always colour within the lines. Don't do anything that isn't supported, ever. If you do, and you run into issues, you're on your own.

A badly coloured version of the SharePoint logo.
With SharePoint, you should always colour within the lines

Remember that Microsoft is constantly adding new features to SharePoint. The customizations you make by injecting custom CSS may stop working if the structure of the pages changes.

What's worse, you could make changes to a page that prevent new features from appearing on your tenant because you're inadvertently hiding elements that are needed for new features.

With custom CSS (and a CSS zen master), you can pretty much do anything you want. The question you should ask yourself is not whether you can do it, but whether it is the right thing to do.

Enough warnings! How do I inject custom CSS?

It is very easy. In fact, I'm probably spending more time explaining how to do it than it took me to write the code for this. If you don't care about how it works, feel free to download the source and install it.

Using SharePoint Framework Extensions, you can write code that you can attach to any Site, Web, or List. You can control the scope by how you register your extensions in your SharePoint tenant.

With an extension, you can insert tags in the HTML Head element.

I know what you're thinking: we could just insert a STYLE block in the HEAD element and add our own CSS. Sure, but what happens when you need to change your CSS? Re-build and re-deploy your extension? Nah!

Instead, how about inserting a LINK tag and point to a custom CSS that's located in a shared location? That way, you can modify the custom CSS in one place.

You can even have more than one custom CSS and use your extension properties to specify the URL to your custom CSS. In fact, you can add more than one extension on a site to combine multiple custom CSS together to suit your needs.

Building your custom CSS injection extension

You too can design a beautiful SharePoint site that looks like this:

sampleresults
I'm really a better designer than this. I just wanted a screen shot that smacks you in the face with a bright red bar and a custom round site icon. It hurts my eyes.
  1. Start by creating your own custom CSS (something better than I did, please). For example, the above look was achieved with the following CSS:
    .ms-compositeHeader {
        background-color: red;
    }
    .ms-siteLogoContainerOuter {
        border-radius: 50%;
        border-width: 3px;
    }
    .ms-siteLogo-actual {
        border-radius: 50%;
    }
  2. Save your custom CSS to a shared location on your SharePoint tenant. For example, you could save it in the Styles Library of your root site collection. You could also add it to your own Office 365 CDN. Make note of the URL to your CSS for later. For example, if you saved your custom CSS as contoso.css in the Styles Library of your tenant contoso.sharepoint.com, your CSS URL will be:
https://contoso.sharepoint.com/Style%20Library/contoso.css

which can be simplified to:

/Style%20Library/contoso.css
  3. Create an SPFx extension following the instructions provided in the Build your first SharePoint Framework Extension (Hello World part 1) article. (Hey, why improve what's already perfect?)
  4. Change the props interface that was created for your ApplicationCustomizer class and replace the description property with cssurl. For example, my ApplicationCustomizer class is called InjectCssApplicationCustomizer, so my props interface is going to be called IInjectCssApplicationCustomizerProperties. Like this:
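A minimal version of that interface looks like this:

    export interface IInjectCssApplicationCustomizerProperties {
      cssurl: string;
    }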
  5. Change your onInit method to insert a LINK element pointing to your cssurl property.
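Here is a rough sketch of what that onInit can look like, assuming the interface from the previous step (the complete code is in the repo linked at the end of this article):

    import { BaseApplicationCustomizer } from "@microsoft/sp-application-base";

    // IInjectCssApplicationCustomizerProperties is the interface from the previous step (same file)
    export default class InjectCssApplicationCustomizer
      extends BaseApplicationCustomizer<IInjectCssApplicationCustomizerProperties> {

      public onInit(): Promise<void> {
        // create a LINK element pointing to the configured CSS and append it to HEAD
        const head: HTMLElement = document.getElementsByTagName("head")[0];
        const customStyle: HTMLLinkElement = document.createElement("link");
        customStyle.href = this.properties.cssurl;
        customStyle.rel = "stylesheet";
        customStyle.type = "text/css";
        head.insertAdjacentElement("beforeend", customStyle);

        return Promise.resolve();
      }
    }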
  6. In your serve.json located in the config folder, change the pageUrl to connect to a page on your tenant. Also change the cssurl property to pass the URL to the custom CSS you created in steps 1-2, as follows:
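The relevant part of serve.json ends up looking something like this (the GUID is your extension's component id from its manifest, so yours will differ):

    "serveConfigurations": {
      "default": {
        "pageUrl": "https://yourtenant.sharepoint.com/SitePages/Test.aspx",
        "customActions": {
          "00000000-0000-0000-0000-000000000000": {
            "location": "ClientSideExtension.ApplicationCustomizer",
            "properties": {
              "cssurl": "/Style%20Library/contoso.css"
            }
          }
        }
      }
    }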
  7. Test that your extension works by running gulp serve. When prompted to allow debug scripts, select Load debug scripts.

DebugScriptWarning

You can now tweak your custom CSS to suit your needs, continuing to hit refresh until you're happy with the results.

Deploying to your production tenant

When ready to deploy, you need to bundle your solution, upload it to the app catalog, and enable the extension on every site you want to customize.

To make things easy, you can add an elements.xml file in your SharePoint folder and pre-configure your custom CSS URL. Here's how:

  1. In your solution's sharepoint/assets folder, create a new file called elements.xml. If you don't have a sharepoint folder or assets sub-folder, create them.
  2. Paste the code below in your elements.xml:
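It will look something like this (the ClientSideComponentId below is a placeholder; the next step explains where to find your real values):

    <?xml version="1.0" encoding="utf-8"?>
    <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
        <CustomAction
            Title="InjectCss"
            Location="ClientSideExtension.ApplicationCustomizer"
            ClientSideComponentId="00000000-0000-0000-0000-000000000000"
            ClientSideComponentProperties="{&quot;cssurl&quot;:&quot;/Style%20Library/contoso.css&quot;}">
        </CustomAction>
    </Elements>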
  3. Make sure to replace the custom action Title and ClientSideComponentId to match your own extension. You can find those values in your InjectCssApplicationCustomizer.manifest.json, under id and alias.
  4. Change the ClientSideComponentProperties to point to your CSS URL. Pay attention to URL-encode the values (e.g.: a space becomes %20).
  5. Run gulp bundle --ship to bundle your solution.
  6. Run gulp package-solution --ship.
  7. Drag and drop the .sppkg file that was created in your sharepoint/solution folder to your tenant's app catalog.

If you selected to automatically deploy to all site collections when building the extension, you're done. If not, you'll need to go to every site and add the extension by using the Site Contents and Add an App links.

Conclusion

You can easily inject custom CSS in every modern page of your SharePoint tenant by using an SPFx extension, but be careful. With great CSS power comes great SharePoint responsibility.

You can get the code for this extension at https://github.com/hugoabernier/react-application-injectcss

I'd love to see what you're doing with your custom CSS. Let me know in the comments what you have done, and -- if you're interested -- share the CSS.

I hope this helps!

Agile North is growing fast!

As it turns out, there are many people who are passionate about Agile methodologies who live North of the GTA. While there are plenty of events about Agile in Toronto, it isn't always easy to juggle work, family life, and attending events downtown.

That's why we started Agile North!

If you are an Agile enthusiast and live in Barrie, Orillia, Gravenhurst (or anywhere else north of Toronto -- we won't judge!), Agile North is always looking for enthusiastic members!

(Also, if you're a co-worker of mine who doesn't want to disappoint me and suddenly find that their SharePoint site collection was irreversibly deleted, you should come too)

We are open to anyone interested in learning, teaching, or just sharing about Agile methodologies.

Just so we're clear: we're talking about the software development methodology, not a new form of yoga :-).

Our next Agile North Meetup will be May 10th at 7 PM! We'll be hosting an Open Space for networking and brainstorming your latest struggles. This is the time to share your experience, ask others for insights, or just hang out with other people who, just a few hours ago, were also explaining to someone else why it isn't a great idea to follow a "big bang" approach (or some other equally bad idea).

Unlike our previous events, there will be no workshops or guest speakers; just an opportunity to meet other like-minded people.

(We'll also mock people who couldn't make it, because they won't be there to defend themselves)

As usual, our hosts at Creative Bean will keep their doors open (and their coffee brewing) just for our event.

Krista Parker, the Agile guru and co-host of this group promises that she'll have swag. (But we heard that before, didn't we?)

So, what are you waiting for? Coffee, swag, smart people! Register now (it's free!)

 

In part 1 of this article, I introduced the concept for an SPFx extension that adds a header to every page, showing the classification information for a site.

In part 2, we created an SPFx extension that adds a header that displays a static message with the security classification of a site.

In part 3, we learned more about property bags and learned a few ways to set the sc_BusinessImpact property (a property we made up) of our test sites to LBI, MBI, and HBI.

In part 4, we wrote the extension that reads from a site's property bags and displays the classification in the header.

In this part, we will clean up a few things, package and deploy the extension.

Preparing to deploy to production

The extension we wrote in parts 1-4 of this article works, but it isn't really production ready.

First, we'll want to change the code to only display the header if it can find a site's information security classification in the site's property bag. That way, if you choose to deploy the extension to production, you won't have to worry about affecting sites that do not have a security classification (although it is recommended that every site have a classification, even if it is LBI by default).

Second, we'll change the hard-coded hyperlink to point to a page on your tenant that provides handling instructions for each security classification.

Then we'll remove all those hard-coded strings and replace them with localized strings.

Let's get started!

Conditionally display the extension

So far, our code assumes that every site has a security classification -- which is the right thing to do if you want to be compliant.

However, there are cases where you may want to deploy this extension in production and not display a security classification until you've actually applied a classification to a site.

To do this, we'll change our code a little bit.

  1. In ClassificationHeader.types.ts, we'll change the default classification to be undefined. So, we're changing this line:
    export const DefaultClassification: string = "LBI";
    

    to this line:

    export const DefaultClassification: string = undefined;
    
  2. Now let's change the render method in ClassificationHeader.tsx to handle an undefined value and skip rendering if there is no security classification. Change this code:
        var barType: MessageBarType;
        switch (businessImpact) {
          case "MBI":
            barType = MessageBarType.warning;
            break;
          case "HBI":
            barType = MessageBarType.severeWarning;
            break;
          default:
            barType = MessageBarType.info;
        }
    

    to this code:

        // change this switch statement to suit your security classification
        var barType: MessageBarType;
        switch (businessImpact) {
          case "MBI":
            barType = MessageBarType.warning;
            break;
          case "HBI":
            barType = MessageBarType.severeWarning;
            break;
          case "LBI":
            barType = MessageBarType.info;
            break;
          default:
            barType = undefined;
        }
    
        // if no security classification, do not display a header
        if (barType === undefined) {
          return null;
        }
    

When you're done, the code should look like this:

Test your extension again, making sure to try with an LBI, MBI, and HBI site, as well as any other site that hasn't been classified yet (i.e.: that doesn't have a security classification property bag value defined yet).

Linking to handling procedures

Since the first part of this article, I have been using a fake URL instead of an actual link to handling instructions. Let's set a default URL to display proper handling procedures.

  1. Start by creating a page on your SharePoint site that explains to your users how they should properly handle information based on their security classification. You can create one page, or (ideally) create a separate set of URLs for each classification.
  2. In ClassificationHeader.types.ts, we'll add a new constant to store the URL to the new handling procedures page you created. If you created more than one, feel free to add more than one constant. If you don't want to use a hyperlink, just set it as undefined. Add this line of code, with the URL of your choice:
    export const DefaultHandlingUrl: string = "/SitePages/Handling-instructions.aspx";
    

    Remember that your URLs should be absolute (e.g.: https://yourtenant.sharepoint.com/sitepages/handling-instructions.aspx) or at least relative to the root (e.g.: /sitepages/handling-instructions.aspx), because your links will get rendered on every page in the site.

  3. Now let's change the render method in ClassificationHeader.tsx to use the handling URL in the hyperlink. Change this code:
 public render(): React.ReactElement<IClassificationHeaderProps> {
    // get the business impact from the state
    let { businessImpact } = this.state;

    // change this switch statement to suit your security classification
    var barType: MessageBarType;
    switch (businessImpact) {
      case "MBI":
        barType = MessageBarType.warning;
        break;
      case "HBI":
        barType = MessageBarType.severeWarning;
        break;
      case "LBI":
        barType = MessageBarType.info;
        break;
      default:
        barType = undefined;
    }

    // if no security classification, do not display a header
    if (barType === undefined) {
      return null;
    }

    return (
      <MessageBar messageBarType={barType}>
        This site is classified as {this.state.businessImpact}. <Link href="#">Learn more about the proper handling procedures.</Link>
      </MessageBar>
    );
  }

to this code (note that you'll need to add an import for DefaultHandlingUrl at the top, not shown here):

public render(): React.ReactElement<IClassificationHeaderProps> {
    // get the business impact from the state
    let { businessImpact } = this.state;

    // get the default handling URL
    let handlingUrl: string = DefaultHandlingUrl;

    // change this switch statement to suit your security classification
    var barType: MessageBarType;
    switch (businessImpact) {
      case "MBI":
        // if you'd like to display a different URL per classification, override the handlingUrl variable here
        // handlingUrl = "/SitePages/Handling-instructions-MBI.aspx"
        barType = MessageBarType.warning;
        break;
      case "HBI":
        barType = MessageBarType.severeWarning;
        break;
      case "LBI":
        barType = MessageBarType.info;
        break;
      default:
        barType = undefined;
    }

    // if no security classification, do not display a header
    if (barType === undefined) {
      return null;
    }

    return (
      <MessageBar messageBarType={barType}>
        This site is classified as {this.state.businessImpact}.
        {handlingUrl && handlingUrl !== undefined ?
          <Link href={handlingUrl}>Learn more about the proper handling procedures.</Link>
          : null
        }
      </MessageBar>
    );
  }

When you're done, the code should look like this:

Localizing resources

There are a few places in our code where we display some text that is hard-coded in the code.

Being of French-Canadian origins, I am especially sensitive to localization; you shouldn't hard-code text, dates, numbers, currencies, or images in code if you can avoid it. Not only does it make it easier to support another language, it also makes it easier to maintain the text in your solution without wading through code.

Flashback: I remember working on a project where the geniuses in the marketing department changed the name of the product about 17 times while we were building it. Every time, the team would have to scour through the code to change the references to the product name. Once they learned the wonders of localization and string resources, they could change all references to the product name in a few seconds (they still gave the marketing department a hard time, though) 🙂

You only need to localize where something displayed to the user could change in a different locale. It's not just about language: dates, numbers, and currencies are displayed differently depending on where you live, even if you speak English. You don't need to worry about debugging code (e.g.: when you write to the console) unless you want people who speak a different language to debug your code too.

Luckily, our code has only a few strings literals to worry about, and they're all in the ClassificationHeader.tsx.

You don't have to localize your code. But you should. So follow these instructions if you want to be a better SPFx developer:

  1. In the myStrings.d.ts file, located in the loc folder (source | extensions | classificationExtension | loc), add the following two lines to the IClassificationExtensionApplicationCustomizerStrings interface:
    ClassifactionMessage: string;
    HandlingMessage: string;
  2. In the en-us.js file, add two more lines below the "Title" line, making sure to add a comma at the end of the line that already exists:
        "ClassifactionMessage": "This site is classified as {0}. ",
        "HandlingMessage": "Learn more about the proper handling procedures."
  3. Now go to the ClassificationHeader.tsx file and add a reference to your localized strings at the top of the file, below all the other import statements:
    import * as strings from "ClassificationExtensionApplicationCustomizerStrings";
  4. Finally, replace the code in the render method to use the localized strings. Note that we're replacing the placeholder in the localization string with the classification label. We could have simply concatenated the values, but every language has a different syntax structure, and doing it this way makes it easier to deal with different language syntax.
    return (
      <MessageBar messageBarType={barType}>
        {strings.ClassifactionMessage.replace("{0}", this.state.businessImpact)}
        {handlingUrl && handlingUrl !== undefined ?
          <Link href={handlingUrl}>{strings.HandlingMessage}</Link>
          : null
        }
      </MessageBar>
    );

Your code should look like this:

Optional: using configuration properties

The eagle-eyed reader may have noticed two things:

  1. There is a testMessage property that is defined in the ClassificationExtensionApplicationCustomizer.ts that we never use.
  2. The ClassificationPropertyBag, DefaultClassification, and DefaultHandlingUrl are all hard-coded. If you ever need to change any of these configuration items, you'd have to change the code, re-build, and re-deploy.

Thankfully, the SPFx team did a great job and designed SPFx extensions to support configuration properties. I don't know if that's what they're actually called, but that's what I call them 🙂

The testMessage is a sample configuration property that is created for us when we use the Yeoman generator. We can replace this property to anything that suits us. In our case, the ClassificationPropertyBag, DefaultClassification, and DefaultHandlingUrl.

To do this, let's follow these steps:

  1. Open ClassificationExtensionApplicationCustomizer.ts and replace the IClassificationExtensionApplicationCustomizerProperties interface code so that it looks like this:
    export interface IClassificationExtensionApplicationCustomizerProperties {
      ClassificationPropertyBag: string;
      DefaultClassification: string;
      DefaultHandlingUrl: string;
    }
  2. In the ClassificationHeader.types.ts file, add the same properties to the IClassificationHeaderProps interface by replacing the code to this:
    export interface IClassificationHeaderProps {
        context: ExtensionContext;
        ClassificationPropertyBag: string;
        DefaultClassification: string;
        DefaultHandlingUrl: string;
    }
  3. While you're in there, make sure to remove the other definitions of ClassificationPropertyBag, DefaultClassification, and DefaultHandlingUrl.
  4. Now back in ClassificationExtensionApplicationCustomizer.ts pass the properties to the ClassificationHeader props by replacing this code:
    const elem: React.ReactElement<IClassificationHeaderProps> = React.createElement(ClassificationHeader, {
            context: this.context
          });

    to this:

    const elem: React.ReactElement<IClassificationHeaderProps> = React.createElement(ClassificationHeader, {
            context: this.context,
            ClassificationPropertyBag: this.properties.ClassificationPropertyBag,
            DefaultClassification: this.properties.DefaultClassification,
            DefaultHandlingUrl: this.properties.DefaultHandlingUrl
          });
    
  5. To prevent any issues from not having any configuration information, let's add some code at the top of the onInit method:
    if (!this.properties.ClassificationPropertyBag) {
          const e: Error = new Error("Missing required configuration parameters");
          Log.error(LOG_SOURCE, e);
          return Promise.reject(e);
        }
  6. Finally, find any references to ClassificationPropertyBag, DefaultClassification, or DefaultHandlingUrl in ClassificationHeader.tsx and replace them with this.props.[property]. For example, replace ClassificationPropertyBag with this.props.ClassificationPropertyBag.

When you're done, the code should look like this:

This will allow you to pass configuration properties to the extension without having to change code.

To test this:

  1. Find serve.json in the config folder.
  2. Replace the "properties" attribute to pass the configuration we need, from this:
    "properties": {
                "testMessage": "Test message"
              }
    

    to this:

    "properties": {
                "ClassificationPropertyBag": "sc_x005f_BusinessImpact",
                "DefaultClassification": "",
                "DefaultHandlingUrl":"/SitePages/Handling-instructions.aspx"
              }
  3. Launch the extension by using gulp serve and test that the extension still works.

Note: if you're planning on debugging the extension, don't forget that the URL has now changed with these new properties. Follow the instructions earlier to copy the URL to the launch.json file.

Deploying to production

Assuming that everything works, we're only a few steps away from deploying to production:

  1. When you deploy the solution that includes the extension, SharePoint looks for the default configuration in the elements.xml and uses whatever it finds. Since we changed the default properties, let's go change the elements.xml file (you can find it in the sharepoint folder) to the following:
    <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
        <CustomAction
            Title="ClassificationExtension"
            Location="ClientSideExtension.ApplicationCustomizer"
            ClientSideComponentId="4017f67b-80c7-4631-b0e5-57bd266bc5c1"
            ClientSideComponentProperties="{&quot;ClassificationPropertyBag&quot;:&quot;sc_x005f_BusinessImpact&quot;,&quot;DefaultClassification&quot;:&quot;&quot;,&quot;DefaultHandlingUrl&quot;:&quot;/SitePages/Handling-instructions.aspx&quot;}">
        </CustomAction>
    </Elements>
    
  2. From the Terminal pane type:
    gulp bundle --ship
  3. Followed by:
    gulp package-solution --ship
  4. Navigate to your tenant's App Catalog site (e.g.: https://yourtenant.sharepoint.com/sites/apps) and go to the Apps for SharePoint library.
  5. Find the folder where the package was created by going to Visual Studio Code and finding the sharepoint | solution folder, right-clicking and selecting Reveal in explorer.
  6. Drag and drop the classification-extension.sppkg solution package to the Apps for SharePoint library.

You should now be able to visit your classified sites and see the extension at work. If it doesn't work, you may have elected to not automatically deploy the solution to every site when you built the extension. If that's the case, you'll need to add the extension to the sites by using Add an App.

Conclusion

It took 5 parts to describe how to build the extension, but we successfully created an extension that reads a site's security classification from its property bag and displays the site's classification in a label.

In our article, we manually set the classification by modifying the property bag, but in the real world, we'll want to use an approach that automatically classifies sites when they are created.

The code for this application (including any modifications I may have made to it since publishing this article) can be found at: https://github.com/hugoabernier/react-application-classification.

If you're interested in seeing how we might approach automatic classification, let me know in the comments and maybe I'll create another (series of) article(s).

I hope this helps!

 

In part 1 of this article, I introduced the concept for an SPFx extension that adds a header to every page, showing the classification information for a site.

In part 2, we created an SPFx extension that adds a header that displays a static message with the security classification of a site.

In part 3, we learned more about property bags and learned a few ways to set the sc_BusinessImpact property (a property we made up) of our test sites to LBI, MBI, and HBI.

In this part, we will finally get to add code to our extension that reads the property bag of the current site and displays the appropriate site classification label.

Reading the classification from the site's property bag

You can get the property bag of a site using a simple REST call to https://yourtenant.sharepoint.com/sites/yoursite/_api/web/allProperties, but it is even easier to use the SP PnP JS library to make queries like these.
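Once the library is added to the project (next section), such a query boils down to a sketch like this (note that SharePoint encodes the underscore in the property name as _x005f_ in the REST response):

    import pnp from "sp-pnp-js";

    // read all of the web's property bag values, then pick out the classification
    pnp.sp.web.select("AllProperties").expand("AllProperties").get()
      .then((web: any) => {
        console.log(web.AllProperties["sc_x005f_BusinessImpact"]);
      });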

Adding the SP PnP JS library to your project

Open the Visual Studio Code solution you created in part 2 and perform the following steps:

  1. Open the terminal pane (CTRL-`).
  2. From the terminal pane, type:
    npm i sp-pnp-js --save
  3. We'll need to update the ExtensionContext in the IClassificationHeaderProps interface. It will allow the ClassificationHeader component to access the context used to make PnP calls. We'll also add a couple variables to the IClassificationHeaderState interface: one to keep the classification we'll retrieve from the property bag, and one to keep track if we're still loading the page.
    The code also defines the classification property bag name (sc_BusinessImpact) and the default classification ("LBI") for when it doesn't find a classification for a site. Feel free to change either of those values to what makes sense for your needs.
    Simply copy and paste the following code to ClassificationHeader.types.ts:
  4. Now we need to pass the ExtensionContext to the ClassificationHeader component. Open the ClassificationExtensionApplicationCustomizer.ts file and paste the following code (line 53 is the only line that was updated):
  5. Now we just need to make the ClassificationHeader component query the property bag when the component mounts, save the classification in the state variable, and change the render code to display the classification. Just copy the code below to ClassificationHeader.tsx:
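The heart of that code is a componentDidMount that looks roughly like this sketch (assuming the constants and state fields described above, and the pnp import at the top of the file):

    public componentDidMount(): void {
      // query the web's property bag and keep the classification in the component state;
      // ClassificationPropertyBag holds the property name, which appears as
      // sc_x005f_BusinessImpact in REST responses
      pnp.sp.web.select("AllProperties").expand("AllProperties").get()
        .then((web: any) => {
          const businessImpact: string =
            web.AllProperties[ClassificationPropertyBag] || DefaultClassification;
          this.setState({ businessImpact: businessImpact, isLoading: false });
        });
    }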

That should be it, let's try it!

  1. From the Terminal pane in Visual Studio Code, type:
    gulp serve
  2. It should launch the browser to the page you had set up in part 2, in serve.json. If prompted to run debug scripts, accept.
  3. Assuming that the default page is not one of your LBI, MBI, or HBI test pages, you should get the default classification (e.g.: LBI).
  4. Change the first part of the browser's URL to point to your HBI page (change the part before ?debugManifestsFile=...), and it should tell you that the site is classified HBI.
  5. Repeat step 4 with your LBI and MBI sites and make sure that you get the right messages.

If everything went well, your sites displayed the right classification, but the message bar didn't change from the default yellow warning. Let's change that.

Changing the message bar type based on the site classification

  1. Change the render method of ClassificationHeader.tsx to display a message bar type of "warning" for MBI, "severeWarning" for HBI, and "info" for everything else. The render method should look like this:
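In rough form, that mapping looks like this (the href is still the fake link at this point):

    let barType: MessageBarType;
    switch (this.state.businessImpact) {
      case "MBI":
        barType = MessageBarType.warning;
        break;
      case "HBI":
        barType = MessageBarType.severeWarning;
        break;
      default:
        barType = MessageBarType.info;
    }

    return (
      <MessageBar messageBarType={barType}>
        This site is classified as {this.state.businessImpact}. <Link href="#">Learn more about the proper handling procedures.</Link>
      </MessageBar>
    );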

Try the LBI, MBI, and HBI test pages again just like you did before, except this time, you should get the following:

TestMBI2
MBI Test Site
TestHBI
HBI Test Site

Help! The extension stopped loading when I changed pages, and it stopped prompting me to load the debug scripts!

You most likely forgot to include the part after ?debugManifestsFile=… in the URL. Try launching the extension again (gulp serve) and copy the part of the URL from the ? onward to your test pages.

(I know because I did this a few times)

How to debug the extension

In theory, the extension should work and load at least the default LBI message. But what if you want to debug the extension?

Here is a simple trick:

  1. Launch your extension by using gulp serve as you did above.
  2. Copy everything in the URL from the ? onward. It should look something like this:
    ?debugManifestsFile=https%3A%2F%2Flocalhost%3A4321%2Ftemp%2Fmanifests.js&loadSPFX=true&customActions=%7B%224017f67b-81c7-5631-b0e5-57bd266bc5c1%22%3A%7B%22location%22%3A%22ClientSideExtension.ApplicationCustomizer%22%2C%22properties%22%3A%7B%22testMessage%22%3A%22Test%20message%22%7D%7D%7D
  3. In your Visual Studio Code project, find launch.json under the .vscode folder.
  4. If you don't have such a file, you probably need to install the Chrome Debugger Extension for Visual Studio Code. Just go to https://aka.ms/spfx-debugger-extensions and follow the instructions to install it.
  5. Find the configuration entry that starts with "name": "Hosted Workbench" and paste the ugly URL you got in step 2 at the end of the URL marked "url". This will add the instructions to load the extension in debug mode.
  6. From the Terminal pane, type:
    gulp serve --nobrowser
    This will start the local web server but won't launch the browser.
  7. Set a few breakpoints where you want to debug the code by using F9. For example, the render method of the ClassificationHeader component.
  8. From the Debug menu in Visual Studio Code, select Start Debugging and it should launch Chrome to the page you specified in launch.json, prompt you to log in, then prompt you to run debug scripts. Accept and you should be able to debug through the code.

That's all for today. The next part of this article will clean up some of the code, add localized strings, prepare the code for production, and deploy it!

 

In part 1 of this article, I introduced the concept for an SPFx extension that adds a header to every page, showing the classification information for a site. In part 2, we created an SPFx extension that adds a header that displays a static message with the security classification of a site.

Yes, static. As in hard-coded. I try to write these articles for people who don't have as much experience with developing SPFx extensions, so I included the step-by-step instructions.

In this article, we'll discuss how we use property bags to store the security classification.

What are property bags anyway?

Property bag is a term used to describe a serialized list of properties. It isn't unique to SharePoint -- I remember using property bags in the good old C days -- but SharePoint has been using them for a long time. Remember this screen from SharePoint Designer?

AncientBag

Property bags are a convenient way to store a whole bunch of properties about things. In SharePoint, a property bag can be applied at the File, Folder, List, or Web level. When set at the Web level, it can be for a Site Collection or Site -- at least that's what MSDN said about SharePoint 2013.

The great thing about property bags in SharePoint is that they are attributes of their parent, which means they are protected the same way their parents are.

In theory, you could use a custom SharePoint list, add it to every site, manage the permissions, and add one row per property you want to store about each site, but that would be painful.

You could also store an XML or JSON file in every site that does the same, but then you'd have to write the code to create and store the file, protect it, and read it.

...or you could use the out-of-the-box mechanism to store metadata about a site, and let SharePoint create it and protect it. You could also use the countless ways to access property bags (SharePoint Designer, PowerShell, CSOM, PnP JS, Office 365 CLI, etc.).

So, for our Classification extension, we'll store and read from the site's property bag. To pay homage to Microsoft's own solution to Implement a SharePoint site classification solution, we'll use sc_BusinessImpact for the property name. You could name it anything you want, but you probably want to keep it somewhat unique.

Here is what the property bag looks like in SharePoint Designer 2013:

PropertyBagSharePoint

Storing custom properties in site property bags

In the previous article, I asked you to create test sites for LBI, MBI, and HBI tests. Now we'll store the values LBI, MBI, and HBI in the sc_BusinessImpact property in each respective site's property bags.

There are a few ways to do this, but since this is just for testing purposes, I'll offer two ways to cheat.

Setting a custom property using SharePoint Designer 2013

Yes, SharePoint Designer 2013 is still around, and it works with Office 365! What's more, you can use it to easily set custom property bag values!

  1. Using SharePoint Designer 2013, go to File | Open SharePoint Site and type the URL of the LBI site you created in the previous article in the Site name field.
  2. Once connected, select Site Options from the toolbar.
    SiteOptions
  3. On the Parameters tab in the Site Options dialog, you'll see the list of properties in the property bag. Don't mess with them.
    SiteOptionsNoPrp
  4. Select Add... to add a new property.
  5. In the Add Name and Value dialog box, type sc_BusinessImpact in the Name field, and LBI in the Value field. Select OK.
    SiteOptionsAdd
  6. Back on the Site Options dialog, you should see the new property you created. Select OK to dismiss the Site Options dialog.
  7. Repeat steps 1-6 with your MBI and HBI site, making sure to use MBI and HBI, respectively, in the Value field for step 5.

Storing custom properties using the Chrome SharePoint Editor Extension

If you haven't installed it yet, the Chrome SharePoint Editor Extension is a wonderful Chrome Extension that makes it easy to manage property bags. This is how to use it.

  1. Using Chrome, browse to your LBI site.
  2. Hit F12 or CTRL-SHIFT-I to open the Developer Tools.
  3. Find the SharePoint tab (should be one of the last ones, after Audit).
  4. From the Chrome SharePoint Editor navigation, select Web properties.
  5. In the New Property Name field, type sc_BusinessImpact
  6. In the New Property Value field, type LBI
  7. Select Add Property to submit your changes.
    PropertyBagusingspeditor.png
  8. You should see a toast notification at the bottom right of the screen indicating it worked.
  9. Repeat steps 1-8 with your MBI and HBI site.

What to do if you get errors setting the property bag values

It is possible that you'll run into an issue where SharePoint actively refuses to set the property bag value. To resolve this issue, you need to temporarily set DenyAddAndCustomizePages to 0 on each site. To do so:

  1. Launch the SharePoint Online Management Shell.
  2. From the command-line, type:
    Connect-SPOService
  3. When prompted for it, enter the URL to your admin site (e.g.: https://mytenant-admin.sharepoint.com) and hit Enter.
  4. You'll most likely be prompted to log-in. Enter your credentials.
  5. Once connected, type the following, making sure to enter the URL to your LBI site:
    Set-SPOSite https://yourtenant.sharepoint.com/sites/testlbi -DenyAddAndCustomizePages 0
  6. Repeat the previous step with your MBI and HBI site URLs, then try again one of the two methods to set your site property bags.

If you wish to do so, you can re-run the above commands setting DenyAddAndCustomizePages to 1 after you're done setting your property bag values. Thanks to Asish Padhy for the inspiration to set DenyAddAndCustomizePages.

You may think "Bah, I can just go to the SharePoint Admin site, and go to the settings, and enable this", but as My SharePoint Log pointed out, you'll have to wait up to 24 hours for this to take effect.

Part III Conclusion

There are plenty of other methods to set property bag values, but the ones I listed above seemed the easiest.

I didn't spend too much time on how to set up the values because, in a real-world scenario, you shouldn't be setting the security classification property bag value by hand. It should be automatically configured when the site is created.

That's something we'll get to much later. For now, we'll focus on changing our hard-coded message bar and making it display the actual site classification.

In the next part of this article, we'll finally return to code and retrieve the site classification from the property bags and display the appropriate message.

In part 1 of this article, I introduced the concept for an SPFx extension that adds a header to every page, showing the classification information for a site.

We'll actually do the coding in this article!

Creating the SPFx extension solution

  1. Using the command line, create a new project directory
md classification-extension
  2. Change the current directory to your new project directory
cd classification-extension
  3. Launch the Yeoman SharePoint Generator:
yo @microsoft/sharepoint
  4. When prompted for the solution name, accept the default classification-extension.
  5. For the baseline package, select SharePoint Online only (latest).
  6. When asked Where do you want to place the files? accept the default Use the current folder.
  7. When asked if you want to allow the tenant admin the choice of being able to deploy the solution to all sites immediately, respond Yes (unless you really want to deploy it to every single site manually).
  8. When asked for the type of client-side component to create, select Extension.
  9. Select Application Customizer when asked which type of client-side extension to create.
  10. Almost there. For Application Customizer name, use ClassificationExtension. Always keep this name under 40 characters.
  11. For Application Customizer description, enter Displays the site's information security classification.
  12. Watch the miracle that is Yeoman creating the project for you. It'll take a few minutes. Eventually, it'll say Congratulations! Solution classification-extension is created. Run gulp serve to play with it! We're not quite ready yet.

Adding a static header

Now that the solution is created, we'll quickly add a header to test that our extension is working. We'll add the dynamic code later.

  1. Launch Visual Studio Code and open the new project you created. From the command line, type:
code .
  2. We could add code to directly manipulate the DOM and insert elements, but I prefer keeping my components in separate .TSX files. It keeps everything simple (because every component is responsible for only one thing), which makes my brain happy. It also keeps everything modular. From your project's file explorer pane, navigate to src | extensions | classificationExtension.
  3. Right-click and select New Folder.
    AddingaFolder
  4. Type components as the folder name.
  5. On the newly created folder, right-click and select New File.
  6. Name the new file ClassificationHeader.types.ts. This file will contain all the types that the header component (to be created soon) will use.
  7. In the ClassificationHeader.types.ts file, paste the following (placeholder) code:
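Since both interfaces start out empty, the placeholder file can be as simple as this:

    export interface IClassificationHeaderProps {
    }

    export interface IClassificationHeaderState {
    }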

8. Now right-click the components folder and select New File. Name your new file ClassificationHeader.tsx.

9. Paste the following code in your ClassificationHeader.tsx:
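The component can be as simple as this sketch: a hard-coded Office UI Fabric MessageBar with a fake link, matching the description under "What the code does" below:

    import * as React from "react";
    import { MessageBar, MessageBarType } from "office-ui-fabric-react/lib/MessageBar";
    import { Link } from "office-ui-fabric-react/lib/Link";
    import { IClassificationHeaderProps, IClassificationHeaderState } from "./ClassificationHeader.types";

    export default class ClassificationHeader extends React.Component<IClassificationHeaderProps, IClassificationHeaderState> {
      public render(): React.ReactElement<IClassificationHeaderProps> {
        // hard-coded for now; later parts read the real classification from the property bag
        return (
          <MessageBar messageBarType={MessageBarType.warning}>
            This site is classified as MBI. <Link href="#">Learn more about the proper handling procedures.</Link>
          </MessageBar>
        );
      }
    }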

10. Finally, find the ClassificationExtensionApplicationCustomizer.ts file that was created by Yeoman and replace its content with the following code:
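Here is a rough sketch of that customizer; the ids and boilerplate come from Yeoman, and the important part is the check for the Top placeholder:

    import * as React from "react";
    import * as ReactDOM from "react-dom";
    import { override } from "@microsoft/decorators";
    import {
      BaseApplicationCustomizer,
      PlaceholderContent,
      PlaceholderName
    } from "@microsoft/sp-application-base";
    import ClassificationHeader from "./components/ClassificationHeader";
    import { IClassificationHeaderProps } from "./components/ClassificationHeader.types";

    export interface IClassificationExtensionApplicationCustomizerProperties {
      testMessage: string;
    }

    export default class ClassificationExtensionApplicationCustomizer
      extends BaseApplicationCustomizer<IClassificationExtensionApplicationCustomizerProperties> {

      private _topPlaceholder: PlaceholderContent | undefined;

      @override
      public onInit(): Promise<void> {
        // never assume the placeholder is there; check every time
        this._topPlaceholder =
          this.context.placeholderProvider.tryCreateContent(PlaceholderName.Top);

        if (this._topPlaceholder) {
          const elem: React.ReactElement<IClassificationHeaderProps> =
            React.createElement(ClassificationHeader, {});
          ReactDOM.render(elem, this._topPlaceholder.domElement);
        }

        return Promise.resolve();
      }
    }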

What the code does:

  • ClassificationExtensionApplicationCustomizer.ts: checks if there is a placeholder available called "Top". If there is, it asks the ClassificationHeader.tsx component to render. You are never supposed to assume that a placeholder is there, so check every time.
  • ClassificationHeader.tsx: renders a static/hard-coded Office UI Fabric MessageBar that says the site is MBI, and provides a fake link.
  • ClassificationHeader.types.ts: defines a property and state interface for the ClassificationHeader component. Right now, both are empty but we'll add some fields in future versions of this code.

Testing that the extension works

Unlike SPFx web parts, you can't test your extensions in the SPFx Workbench. I hope that this will be fixed in future versions of the workbench, but until then you need to test on a real page on your Office 365 tenant.

Here is how to test your extension:

    1. In Visual Studio Code, find serve.json (located in the config folder).
    2. Find an entry that looks like https://contoso.sharepoint.com/sites/mySite/SitePages/myPage.aspx and replace it with the URL of a test page on your Office 365 tenant. For example: https://yourtenant.sharepoint.com/SitePages/Test-extension.aspx. There should be two instances to replace.
    3. From the Terminal pane (hit CTRL-`) type:
      gulp serve
    4. After a few moments, your favourite browser should launch and you should get a scary warning:
      DebugScriptWarning
    5. Select Load debug scripts and the page should load with our fancy message bar at the top.
      TestMBI

 

I would consider that a success! Except, of course, that the extension is hard-coded to say that the site is classified as MBI.

But first, we need to create some test sites and classify them.

Creating test sites

In your Office 365 tenant, create three new sites. You can use the Communication or Team site template, as long as you use a modern template.

The three sites will be:

  • TestLBI
  • TestMBI
  • TestHBI

You can use any naming convention you'd like; just make note of the URLs for each site because you'll need them in the next step.

We'll set the property bags on each of the three testing sites, but -- unfortunately -- it'll have to be in the next article.


Value proposition

As an independent consultant, I get to work with a lot of organizations in both public and private sectors. Most deal with various levels of security classification.

Governance is always a hot topic with SharePoint. Most understand the importance of governance; some shrug it off as a "we'll deal with it when it becomes a problem" -- which is never a good idea, as far as I'm concerned.

But what if we could make applying governance in SharePoint a lot easier? So easy, in fact, that it would be less painful to build it in up front than to deal with it when it becomes a problem.

That's what I hope to do with this series of blog articles: demonstrate easy ways to introduce some level of governance using new enabling technologies -- like SPFx web parts, extensions, and site scripts.

My goal is not to duplicate the work of Microsoft and others; I may use a very simple approach in this first blog to keep the example easy to understand, but I fully intend on leveraging out-of-the-box Office 365 features like Data Loss Prevention (DLP) features.

I hope you'll stick with me for the journey!

Information security classification

Information security classification or information classification is a step in the process of managing information. There are people who are way smarter about this topic, and there is a whole ISO 27001 standard on the topic, so I'll avoid a detailed explanation.

…But I'll definitely throw in a gratuitous graphic. I guess my time at McKinsey & Company rubbed off on me.

Managing classified information typically consists of 4 steps:

  • Asset inventory: finding out what kind of information your organization has, and who is responsible for it.
  • Information classification: identifying how sensitive the information is. How bad would it be if this information was leaked, its integrity compromised, etc.? There is no one way to classify information -- it depends on your organization's size, industry, country, etc. The most frequently used examples are:
    • Confidential: top confidentiality level
    • Restricted: medium confidentiality level
    • Internal use: lowest level of confidentiality
    • Public: everyone can see the information
  • Information labelling: you kinda need to tell your employees how the information is classified so that they can handle it properly.
  • Information handling: where you define rules and processes around how to handle the information.

This article will focus on the information handling part of the process.

Microsoft's information classification

Microsoft internally classifies their information as follows:

    • High Business Impact (HBI): Authentication / authorization credentials (i.e., usernames and passwords, private cryptography keys, PINs, and hardware or software tokens), and highly sensitive personally identifiable information (PII) including government-provided credentials (i.e. passport, social security, or driver’s license numbers), financial data such as credit card information, credit reports, or personal income statements, and medical information such as records and biometric identifiers.
    • Moderate Business Impact (MBI): Includes all personally identifiable information (PII) that is not classified as HBI such as: Information that can be used to contact an individual such as name, address, e-mail address, fax number, phone number, IP address, etc; Information regarding an individual’s race, ethnic origin, political opinions, religious beliefs, trade union membership, physical or mental health, sexual orientation, commission or alleged commission of offenses and court proceedings.
    • Low Business Impact (LBI): Includes all other information that does not fall into the HBI or MBI categories.

A while ago, Microsoft also released on GitHub a cool solution to apply their classification on SharePoint sites. They also have a great case study that shows how they approached classification on their own content.

So, since I want to keep things simple, I'll use HBI, MBI, and LBI classification labels in my example. You can use your own classification if you want.

Using SPFx extensions to add a header

If you read my equally long post on creating SPFx extensions, you'll know that you can use SPFx extensions to do cool things on every page of a site. To keep this example really simple, I'll create a header that reads the site's property bag and displays a very simple Office UI Fabric MessageBar indicating the site's classification. It isn't going to be particularly pretty, but we can improve on looks later.

The bar will say "This site is classified as [LBI|MBI|HBI]. Learn more about the proper handling procedures.", but you can make it say whatever is appropriate for you.

Here is what the HBI header will look like:
HBI header

The MBI header:
MBI header

And the LBI header:
LBI header

In the next article, we'll start writing the code.

 

Ready? Let's get coding!

Creating the SPFx extension solution

  1. Using the command line, create a new project directory:
md classification-extension
  2. Change the current directory to your new project directory:
cd classification-extension
  3. Launch the Yeoman SharePoint Generator:
yo @microsoft/sharepoint
  4. When prompted for the solution name, accept the default classification-extension.
  5. For the baseline package, select SharePoint Online only (latest).
  6. When asked Where do you want to place the files?, accept the default Use the current folder.
  7. When asked if you want to allow the tenant admin the choice of being able to deploy the solution to all sites immediately, respond Yes (unless you really want to deploy it to every single site manually).
  8. When asked for the type of client-side component to create, select Extension.
  9. Select Application Customizer when asked Which type of client-side extension to create.
  10. Almost there. For the Application Customizer name, use ClassificationExtension. Always keep this name under 40 characters.
  11. For the Application Customizer description, enter Displays the site's information security classification.
  12. Watch the miracle that is Yeoman creating the project for you. It'll take a few minutes. Eventually, it'll say Congratulations! Solution classification-extension is created. Run gulp serve to play with it! We're not quite ready yet.
  13. Launch Visual Studio Code and open the new project you created. From the command line, type:
code .


An awesome part of SPFx is the ability to create SharePoint Framework Extensions. At the time of this writing, you can write three types of SPFx extensions:

  • Application customizers: to add scripts to pages and attach HTML to predefined (well-known) HTML element placeholders. At the moment, there are only a few page placeholders (like headers and footers), but I'm sure the hard-working SPFx team will announce new ones soon enough. For example, you can add your own customized copyright and privacy notices at the bottom of every modern page.
  • Field customizers: to change the way fields are rendered within a list. For example, you could render your own sparkline chart on every row in a list view.
  • Command sets: to add commands to list view toolbars. For example, you could add a button to perform an action on a selected list item.

This article doesn't try to explain how to create extensions -- there are many great examples on the SharePoint Framework Extensions Samples & Tutorial Materials GitHub repo, and the Overview of SharePoint Framework Extensions tutorial is a pretty good place to start if you haven't played with extensions yet.

In this article, I'll share a PowerShell script I use to deploy to many sites at once.

But first, a few things you need to know:

  • To deploy an extension, you need to first deploy the solution (.sppkg) containing the extension, then add a custom user action to your site, web, or list. In other words, you tell the site, web, or list to use the extension that you deployed in the solution. There are no user interfaces to add custom user actions (see the sketch after this list).
  • When you add a custom user action, you can pass configuration properties to your extension.
  • It is possible to add a custom user action to the same site, web, or list more than once (because you could pass different configuration properties for every instance).
  • You can also specify a JSON file in your solution that will automatically deploy and add the custom user action, but you can't customize the configuration properties.
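To make the first point in that list concrete, here is a sketch of adding a custom user action programmatically with PnPjs in TypeScript -- an alternative to the PowerShell used later in this article. The names, GUID, and properties are placeholders, and the sketch assumes PnPjs is already set up with a SharePoint context:

import { sp } from '@pnp/sp';

// Registers an application customizer on the current web by adding
// a user custom action (placeholder name, GUID, and properties)
async function addExtensionToWeb(): Promise<void> {
  await sp.web.userCustomActions.add({
    Title: 'MyExtension',
    Name: 'MyExtension',
    Location: 'ClientSideExtension.ApplicationCustomizer',
    ClientSideComponentId: '00000000-0000-0000-0000-000000000000',
    ClientSideComponentProperties: JSON.stringify({ message: 'Hello' })
  });
}

The PowerShell script later in this article does the same thing at scale, with a check to avoid adding the extension twice.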

When you have a SharePoint tenant with lots and lots of sites, and you need to provide different configuration properties for each site, it can become painful to deploy an extension everywhere.

Sure, the solution deployment step is easy: just make sure that your package-solution.json has "skipFeatureDeployment": true, and SharePoint will kindly offer to automatically deploy your solution to every site for you.

But to add an extension as a custom user action and provide configuration properties, you need to call a command or use some scripts.

When I need to do just one site, I'll often use the SPFx-extensions-cli, but when I need to do a whole bunch of sites, I like to use the PnP PowerShell cmdlets and PowerShell.

The idea came from the RegionsFooterProvisionCustomizer.ps1 script on Paolo Pialorsi's awesome Regions Footer Application Customizer example, which goes like this:

$credentials = Get-Credential
Connect-PnPOnline "https://.sharepoint.com/sites/" -Credentials $credentials

$context = Get-PnPContext
$web = Get-PnPWeb
$context.Load($web)
Execute-PnPQuery

$ca = $web.UserCustomActions.Add()
$ca.ClientSideComponentId = "67fd1d01-84e8-4fbf-85bd-4b80768c6080"
$ca.ClientSideComponentProperties = "{""SourceTermSetName"":""Regions""}"
$ca.Location = "ClientSideExtension.ApplicationCustomizer"
$ca.Name = "RegionsFooterCustomAction"
$ca.Title = "RegionsFooterCustomizer"
$ca.Description = "Custom action for Regions Footer Application Customizer"
$ca.Update()

$context.Load($web.UserCustomActions)
Execute-PnPQuery

Now, Paolo's script will only work for his extension, but you can easily go in and change the ClientSideComponentId, ClientSideComponentProperties, Name, Title, and Description to make it your own. And if you mistakenly re-run the script for the same site twice, the extension will appear twice.

But I wanted to repeat this for every one of my tenant's bazillion sites, providing different configuration properties if necessary. I also wanted to be able to re-run the script as many times as I wanted. Finally, I wanted the customer to be able to simply provide a CSV with a list of sites where they wanted the extensions applied.

So I tweaked Paolo's code to read the list of sites from a CSV file and apply the extension to each site. I borrowed a lot of this script from another example on the SharePoint Framework Extensions Samples & Tutorial Materials GitHub repo, but I can't find it anymore, so I can't tell who I should give the credit to. Let me know in the comments if you know who deserves the credit. I'm lazy, but I'm not a thief 🙂

First, make sure that you install the PnP PowerShell cmdlets on your workstation.

Then create a new PowerShell file and copy this code into it:


$credentials = Get-Credential

# Import the list of sites where we want to apply the extension
$sitesToProcess = import-csv "sites.csv"

# details of custom action/SPFx extension
[guid]$spfxExtId = "[extension id goes here]"
[string]$spfxExtName = "[extension name goes here]"
[string]$spfxExtTitle = "[extension title goes here]"
[string]$spfxExtGroup = "[extension group goes here]"
[string]$spfxExtDescription = "[extension description goes here]"
[string]$spfxExtLocation = "ClientSideExtension.ApplicationCustomizer"
[string]$spfxExtension_Properties = "[properties JSON goes here]"

function Add-CustomActionForSPFxExt ([string]$url, $clientContext) {
    Write-Output "-- About to add custom action to: $url"

    $rootWeb = $clientContext.Web
    $clientContext.ExecuteQuery()
    $customActions = $rootWeb.UserCustomActions
    $clientContext.Load($customActions)
    $clientContext.ExecuteQuery()

    $custAction = $customActions.Add()
    $custAction.Name = $spfxExtName
    $custAction.Title = $spfxExtTitle
    $custAction.Description = $spfxExtDescription
    $custAction.Location = $spfxExtLocation
    $custAction.ClientSideComponentId = $spfxExtId
    $custAction.ClientSideComponentProperties = $spfxExtension_Properties
    $custAction.Update()
    $clientContext.ExecuteQuery()

    Write-Output "-- Successfully added extension" 	
	
    Write-Output "Processed: $url"
}
function Remove-CustomActionForSPFxExt ([string]$extensionName, [string]$url, $clientContext) {
    Write-Output "-- About to remove custom action with name '$($extensionName)' from: $url"

    $actionsToRemove = Get-PnPCustomAction -Web $clientContext.Web | Where-Object {$_.Location -eq $spfxExtLocation -and $_.Name -eq $extensionName }
    Write-Output "-- Found $($actionsToRemove.Count) extensions with name $extensionName on this web." 	
    foreach ($action in $actionsToRemove) {
        Remove-PnPCustomAction -Identity $action.Id
        Write-Output "-- Successfully removed extension $extensionName from web $url." 	
    }

    Write-Output "Processed: $url"
}

# -- end functions --

foreach ($site in $sitesToProcess) {
    $ctx = $null
    $url = $site.Url
    try {
        Connect-PnPOnline -Url $url -Credentials $credentials
        Write-Output ""
        Write-Output "Authenticated to: $url"
        $ctx = Get-PnPContext
    }
    catch {
        Write-Error "Failed to authenticate to $url"
        Write-Error $_.Exception
    }

	# Make sure we have a context before continuing
    if ($ctx) {
		# Find out if the extension is already added
		$existingActions = Get-PnPCustomAction -Web $ctx.Web | Where-Object {$_.Location -eq $spfxExtLocation -and $_.Name -eq $spfxExtName }
		
		# Count how many existing extensions we found
		$count = $($existingActions.Count)
		
		# Don't re-install extension if it is already there
        if ($count -ge 1) {
			#This assumes that you don't want to duplicate extensions. If you do, feel free to change the logic below
            if ($count -eq 1) {
                Write-Output "Extension is already applied"
            }
            else {
                Write-Warning "Extension is duplicated!"
            }
        }
        else {
			# Add the extension
			Add-CustomActionForSPFxExt $url $ctx
			Write-Output "-- Successfully added extension $spfxExtName to web $url."
        }
		
        #Add-CustomActionForSPFxExt $url $ctx
        #Remove-CustomActionForSPFxExt $spfxExtName $site $ctx
        #Get-PnPCustomAction -Web $ctx.Web | Where-Object {$_.Location -eq "ClientSideExtension.ApplicationCustomizer" }
    }
}

Make sure to replace all the [bracketed sections] with your own information. I get the name and id from the extension's manifest.json file.

Then, create a CSV file containing all the sites you want to get the extension. It should look like this:

Url
https://yourtenantgoeshere.sharepoint.com/sites/Employee
https://yourtenantgoeshere.sharepoint.com/sites/Employee/About
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Calendars
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Learning
https://yourtenantgoeshere.sharepoint.com/sites/Employee/FAQs
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Learning
https://yourtenantgoeshere.sharepoint.com/sites/Employee/News
https://yourtenantgoeshere.sharepoint.com/sites/Employee/InformationTechnology
https://yourtenantgoeshere.sharepoint.com/sites/Employee/MarketingAndCommunications
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Security
https://yourtenantgoeshere.sharepoint.com/sites/Employee/EnvironmentalSustainability
https://yourtenantgoeshere.sharepoint.com/sites/Employee/HealthAndSafety
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Fundraising
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Glossary
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Parking
https://yourtenantgoeshere.sharepoint.com/sites/Employee/purchasing

Use your own urls, and save it as sites.csv in the same folder as the PowerShell script.

Then you can run the script: it'll connect to every site, apply the extension, and provide the configuration properties -- but only if the extension hasn't already been installed.

You could also tweak the script and the CSV to pass different configuration properties for each site, but I'll reserve it for another post.

Leave me a comment if you'd like me to post it.

I hope it helps!

As the World's Laziest Developer, I don't like to invent anything new if I can find something that already exists (and meets my needs).

This article is a great example of that mentality. I'm really standing on the shoulders of giants, combining a few links and re-using someone else's code (with credit, of course) to document my approach to versioning SPFx packages, in the hope that it helps someone else.

CHANGELOG.md: a standard way to communicate changes that doesn't suck

The problem with change logs

There are a few ways to communicate changes when working on a project: you can use your commit log diffs, GitHub Releases, your own log, or any other standard out there.

The problem with commit log diffs is that, while comprehensive, they are an automated log of changes that include every-single-change. Log diffs are great for documenting code changes, but if you have a team of developers merging multiple commits every day between versions, they aren't great at summarizing the noteworthy differences.

GitHub Releases solves part of this problem by making it easy to manually (or automatically) create release notes with git tags. (If you haven't looked into GitHub Releases, it is awesome -- take a look!)

However, GitHub Releases is still not very user-friendly (or manager-friendly).

You can always write your own change log format, but why not adopt a format and structure that you can use consistently across projects & teams?

CHANGELOG.md

This is where CHANGELOGs come in. According to Olivier Lacan at KeepAChangeLog.com, a changelog is...

"a file which contains a curated, chronologically ordered list of notable changes for each version of a project."

Changelogs use the markdown syntax to make it easy to maintain. They follow a few principles (again, credit to KeepAChangeLog.com):

  • They are for humans, not machines: they should be easy to read and quickly make sense of relevant changes.
  • There should be an entry for every single version:
    • Latest version comes first: list versions in reverse-chronological order; it makes it easier to see what matters.
    • Release date of each version is displayed: use a consistent ISO standard date format (e.g.: 2018-04-16).
    • Versions should be linkable: this becomes handy when you have a giant changelog. Just wrap your version number with [] (e.g.: [0.0.1]).
    • Changes should be grouped by type of change: group your changes into Added, Changed, Deprecated, Removed, Fixed, and Security. Only include the groups of change types you have (no need to have a Deprecated section if you don't have any deprecated-type changes).
  • Mention whether you follow Semantic Versioning: you should, by the way.

How to use CHANGELOG.md in your SPFx project

  1. Add a new file in your project (wherever you put your README.md) and call it CHANGELOG.md.
    (Sure, you can name your changelog whatever you want, but the whole point of a changelog is to make it easy to find the changes on any project, consistently. Just name it CHANGELOG.md. Trust me.)
  2. Paste this template in the new file you created:
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## [Unreleased]
### Added

- (List new added features)

### Changed

- (List changes to existing functionality)

### Deprecated

- (List soon-to-be removed features)

### Removed

- (List features removed in this version)

### Fixed

- (List bugs fixed in this version)

### Security

- (List vulnerabilities that were fixed in this version)
  3. As you work, keep a log of your changes in the Unreleased section, making sure to put the changes under their respective change types. If you want, you can even link to commits, but I don't.
  4. When you change your solution version, create a new version section below the Unreleased section. For example, for version 0.0.1 created April 16, 2018, insert the following text below the unreleased version:

## [0.0.1] - 2018-04-16

Remember that not everyone is an American-born, native English speaker. Use the ISO Standard format for dates. The French-Canadian in me thanks you.

  5. Copy all the changes from Unreleased to your new version section, making sure to remove any empty change type sections. For example, if you don't have any deprecated changes, remove the ### Deprecated section.
  6. This is what the final version of your CHANGELOG.md would look like:
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## [Unreleased]

## [0.0.1] - 2018-04-16

### Added
- (List new added features)

### Changed
- (List changes to existing functionality)

### Removed
- (List features removed in this version)

### Fixed
- (List bugs fixed in this version)

### Security
- (List vulnerabilities that were fixed in this version)
  7. Copy the section templates back to the Unreleased section and continue steps 3-7 with every new version.

Semantic versioning

I have worked with Microsoft technologies as long as I can remember, so it is ingrained in me that every version number should consist of 4 parts: Major, Minor, Build, Revision. For example, 1.0.0.0.

When you package an SPFx solution, the solution version always starts with version 1.0.0.0, and you can't make it lower than that. (Well, you can, but SharePoint will ignore it and it will become version 1.0.0.0).

Imagine my horror when, one day, I was trying to change the version number of a solution and searched for 1.0.0 and found that the NodeJS package also has its own version, stored in a file called package.json. What's worse, it didn't even have 4 parts!

The heresy!

After my initial indignation, I decided to research this and found that the versioning schema is called Semantic Versioning (or sem-ver for short). It consists of three mandatory parts: Major, Minor, Patch, plus an optional label for pre-release and build metadata. For example, you could have a version 1.0.0-rc for a release candidate version.

Hmmm, makes it easier to keep track of versions. And it is more human-readable, which is always good.

To keep things even more confusing, each web part can have its own version number. While there are valid reasons why you would want to keep the package version, the solution version and the web part versions separate, it quickly becomes impossible to keep track of versions.

To keep things clean, it makes sense to keep version numbers in sync.

npm version

Luckily, npm makes it easy to update your package.json version by simply calling:

npm version <major|minor|patch>

Where you specify to increase either the major, minor, or patch version.

For example, if you start with a package.json version 0.0.3 and want to increase the major version, you'd call:

npm version major

Which would produce v1.0.0.
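If the bump semantics are new to you, here is a toy TypeScript sketch of what each keyword does to the version parts. This is illustrative only -- it is not how npm implements it, and it ignores pre-release labels beyond stripping them:

type Part = 'major' | 'minor' | 'patch';

// Bumps a sem-ver string the way npm version <part> does (toy version)
function bump(version: string, part: Part): string {
  // strip any pre-release label, such as '-rc'
  const [core] = version.split('-');
  let [major, minor, patch] = core.split('.').map(Number);
  if (part === 'major') { major++; minor = 0; patch = 0; }
  else if (part === 'minor') { minor++; patch = 0; }
  else { patch++; }
  return `${major}.${minor}.${patch}`;
}

console.log(bump('0.0.3', 'major')); // '1.0.0'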

If only there was a way to make it this easy to synchronize the package.json version to the package-solution.json version.

If only someone way smarter than I had thought of this...

Sync npm version with package-solution.json

It turns out there is such a person: Stefan Bauer!

In his blog post, he shares a way to add a Gulp function that automatically syncs the package.json version with the package-solution.json.

(Thanks Stefan for being awesome!)

To add this Gulp function, do the following steps:

  1. In your SPFx project, open gulpfile.js
  2. Before build.initialize(gulp); add my slightly modified version of Stefan's code. If it works, credit goes to Stefan. If it fails, it was my changes.
    let syncVersionsSubtask = build.subTask('version-sync', function (gulp, buildOptions, done) {
      this.log('Synching versions');
    
      // import gulp utilities to write error messages
      const gutil = require('gulp-util');
    
      // import file system utilities from NodeJS
      const fs = require('fs');
    
      // read package.json
      var pkgConfig = require('./package.json');
    
      // read configuration of web part solution file
      var pkgSolution = require('./config/package-solution.json');
    
      // log old version
      this.log('package-solution.json version:\t' + pkgSolution.solution.version);
    
      // Generate new MS compliant version number
      var newVersionNumber = pkgConfig.version.split('-')[0] + '.0';
    
      if (pkgSolution.solution.version !== newVersionNumber) {
        // assign newly generated version number to web part version
        pkgSolution.solution.version = newVersionNumber;
    
        // log new version
        this.log('New package-solution.json version:\t' + pkgSolution.solution.version);
    
        // write changed package-solution file (synchronously, so the
        // updated version is on disk before the build continues)
        fs.writeFileSync('./config/package-solution.json', JSON.stringify(pkgSolution, null, 4));
      }
      else {
        this.log('package-solution.json version is up-to-date');
      }
      done();
    });
    
    let syncVersionTask = build.task('version-sync', syncVersionsSubtask);
    
    build.rig.addPreBuildTask(syncVersionTask);
  3. Save your file

The final gulpfile.js should look like this:

'use strict';

const gulp = require('gulp');
const build = require('@microsoft/sp-build-web');

build.addSuppression(`Warning - [sass] The local CSS class 'ms-Grid' is not camelCase and will not be type-safe.`);

//BEGIN: Added code for version-sync
let syncVersionsSubtask = build.subTask('version-sync', function (gulp, buildOptions, done) {
  this.log('Synching versions');

  // import gulp utilities to write error messages
  const gutil = require('gulp-util');

  // import file system utilities from NodeJS
  const fs = require('fs');

  // read package.json
  var pkgConfig = require('./package.json');

  // read configuration of web part solution file
  var pkgSolution = require('./config/package-solution.json');

  // log old version
  this.log('package-solution.json version:\t' + pkgSolution.solution.version);

  // Generate new MS compliant version number
  var newVersionNumber = pkgConfig.version.split('-')[0] + '.0';

  if (pkgSolution.solution.version !== newVersionNumber) {
    // assign newly generated version number to web part version
    pkgSolution.solution.version = newVersionNumber;

    // log new version
    this.log('New package-solution.json version:\t' + pkgSolution.solution.version);

    // write changed package-solution file (synchronously, so the
    // updated version is on disk before the build continues)
    fs.writeFileSync('./config/package-solution.json', JSON.stringify(pkgSolution, null, 4));
  }
  else {
    this.log('package-solution.json version is up-to-date');
  }
  done();
});

let syncVersionTask = build.task('version-sync', syncVersionsSubtask);

build.rig.addPreBuildTask(syncVersionTask);
//END: Added code for version-sync

build.initialize(gulp);

Next time you build your package, the Gulp task version-sync will grab the package.json version (which you updated using npm version, right?) and will update package-solution.json, adding an extra zero at the end of the version number to Microsoftify the version.

When you get the version number, go update your CHANGELOG.md file by moving the changes from [unreleased] to a new section with the new version number you just created.

Sync package-solution.json version with webpart.manifest.json version

So far, we have done the following:

  • Created a CHANGELOG.md of unreleased changes
  • Maintained version number using npm version
  • Synchronized package.json versions with package-solution.json versions
  • Updated your CHANGELOG.md to describe the changes you made

But there is still one little annoying thing: the web part versions (stored in webpart.manifest.json, where webpart is the name of your web part) can differ from the package.json and package-solution.json versions.

Turns out that it is pretty easy to fix:

  1. In your SPFx solution, open webpart.manifest.json where webpart is the name of your web part. For example, HelloWorldWebPart.manifest.json for HelloWorldWebPart.
  2. Find the "version" line and replace whatever version you have in there with "*", making it:
"version": "*",

Doing so will cause the version of the webpart.manifest.json to match the package-solution.json version.

Turns out that the latest version of SPFx documents this by adding the following comment on the line above "version": "*":

// The "*" signifies that the version should be taken from the package.json
"version": "*",

How cool is that?!

Conclusion

By using CHANGELOG.md to keep track of changes between versions, and using semantic versioning for your versions, you can make it pretty easy to document your changes across versions.

By using npm version, you can easily maintain the semantic version of your package.json.

By using Stefan's cool version-sync Gulp command, you can easily sync your package.json version and your package-solution.json.

By using "version": "*", you can synchronize your package-solution.json and your webpart.manifest.json versions.

Finally, by not reinventing the wheel and by leveraging the hard-work of other people, you can do it all with very little effort!

I hope this helps you!

This is an easy one, but I keep Googling it.

When you create an SPFx web part, the default Property Pane automatically submits changes to the web part. There is no "Apply" button.

Property Pane without Apply
Default property pane -- no Apply button
But sometimes you don't want changes to the property pane fields to automatically apply.

All you have to do is add this method to your web part class (just before getPropertyPaneConfiguration is where I like to place it):
protected get disableReactivePropertyChanges(): boolean {
	return true;
}
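For context, here is roughly where that getter sits in a web part class. A minimal sketch: the HelloWorld names are placeholders, and the import path for IPropertyPaneConfiguration varies by SPFx version:

import {
  BaseClientSideWebPart,
  IPropertyPaneConfiguration
} from '@microsoft/sp-webpart-base';

export interface IHelloWorldWebPartProps {
  description: string;
}

export default class HelloWorldWebPart extends BaseClientSideWebPart<IHelloWorldWebPartProps> {

  // Returning true disables reactive (instant) property changes,
  // which gives the property pane an Apply button
  protected get disableReactivePropertyChanges(): boolean {
    return true;
  }

  protected getPropertyPaneConfiguration(): IPropertyPaneConfiguration {
    return { pages: [] };
  }

  public render(): void {
    this.domElement.innerHTML = `<div>${this.properties.description}</div>`;
  }
}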

When you refresh the web part, your property pane will sport a fancy Apply button!

PropertyPaneWithApply.png
Property pane with an Apply button

Property changes in the property pane will only get applied when users hit Apply.

That's it!

Hub sites?

Unless you're a SharePoint geek like me, you may not have been eagerly waiting for this new feature announced at Ignite 2017 in Orlando. Hub sites are a special site template that allows you to logically group team sites and communication sites under another site, with a shared navigation, theme, and logo.

Hub sites will also aggregate news and activities from any sites associated with them, and you can search within the scope of a hub site and its associated sites.

The picture Microsoft used in their announcement explains it best:

hubbahubba

The Problem

The typical corporate intranet is often nothing more than a re-hash of the company's corporate organization structure, blindly copied to a web site accessible to employees. If that intranet is done using SharePoint or Office 365, it'll consist of a bunch of site collections with some sub-sites.

(By the way, I completely disagree with using the org chart for your intranet structure, but I'll save it for another blog post).

What happens when your company restructures for (insert official reason here)? Let's say that you had a whole bunch of Divisions, each with their own site (or site collection) and they completely change the divisions every quarter (like the CEO of a former client of mine liked to do).

What happens when the IT, Finance, and HR team are no longer in the same groups?

You end up having to either:
a) Move sites around, breaking a lot of people's favourite shortcuts and links; or
b) Leave everything the way it is and give up hope.

Or, you could create a structure that doesn't need to change with the org-chart-of-the-week by using a flat structure. With the new modern sites in Office 365, it is a lot easier to create groups, team sites, and communication sites in a rather "flat" structure (every site is created in its own site collection, located under https://yourtenant.sharepoint.com/sites/ or https://yourtenant.sharepoint.com/teams/).

So, now you end up with a flat site structure that doesn't need to change when your information architecture changes again, but there is no easy way to navigate through this flat structure.

You can hack together some sort of global navigation with custom code and/or scripts, but every time someone wants to add a new site, you need to change the code.

The Solution

SharePoint hub sites allow you to continue creating a flat structure while logically grouping sites together in a semi-hierarchical fashion.

There are caveats:

  • As of this writing, you can only have up to 50 hub sites on your tenant.
  • You can add sites to hub sites, but you can't add hub sites to hub sites. And don't get me started about hub sites under hub sites under hub sites.
  • You need to be a SharePoint admin to create hub sites, but you can control who can add sites to what hub sites.
  • You'll need to do some PowerShell.

Demonstration

We are going to create an Employee Matters hub, which will be the go-to place for employees to find resources related to being an employee of [XYZ Corp].

It will contain the following sites:

  • Benefits
  • Jobs
  • Training

Before you start

Download and install the latest SharePoint Online Management Shell.

Create "Sub" Communication Sites

  1. From your Office 365 environment, create a Communication site by going to the waffle
    waffle
    | SharePoint | Create site.
    createsite1
  2. From the Create site panel, select Communication site. It also works with Team sites.
    create site 2
  3. Choose the Topic layout and name the site Benefits. Give it a description if you'd like. Select Finish.
    Createsite3
  4. Repeat steps 1-3 above with Jobs and Training (or anything else you'd like to do), making sure to remember the url of every site you create (you'll need to go back to the sites you just created later).

Create a (future) hub site

Repeat steps 1-3 above again, but this time call the site Employee Matters. This will be the site that will be converted to a hub site. Make note of the site's url.

Register the hub site

  1. Start the SharePoint Online Management Shell.
    SPOMS
  2. From the PowerShell command prompt, type:
    Connect-SPOService -url https://yourtenant-admin.sharepoint.com

    where yourtenant is your own SharePoint tenant name. Note that we're connecting to the Admin site, not the regular yourtenant.sharepoint.com site.

  3. Once connected (you'll be prompted to login, probably), type:
    Register-SPOHubSite -site https://yourtenant.sharepoint.com/sites/employeematters

    ...making sure to use the url of the Employee Matters site you created earlier. Note that this time, we are not using the yourtenant-admin.sharepoint.com domain, just the regular yourtenant.sharepoint.com domain.

  4. If all goes well, you'll get something like this:
    ID : 2be153d3-0fe8-4fb8-8fa0-b41dfdd8bd3f
    Title : Employee Matters
    SiteId : 2be153d3-0fe8-4fb8-8fa0-b41dfdd8bd3f
    SiteUrl : https://yourtenant.sharepoint.com/sites/EmployeeMatters
    LogoUrl :
    Description :
    Permissions :
  5. Memorize the GUIDs. Just kidding! You can pretty much ignore the response -- as long as it didn't start spewing red text, you're doing fine.

At this point, if you got an error saying Register-SPOHubSite is not a valid command, you probably haven't installed the latest version of the SharePoint Online Management Shell.

If it gives you an error saying that hub sites aren't yet supported, go have a big nap and try again tomorrow.

You can go visit your newly created hub site. It should look like this:
employeematters1.png

It doesn't look much different than any other communication site, but it has an extra navigation bit at the top:

hubsite2

If your site hasn't updated yet, wait a little bit. Some of the changes take up to 2 hours, but every time I have done this, it was instant.

Optional: Set your hub site icon and description

You don't have to do this, but it is generally a good idea to label your sites and give them a custom icon. To do so:

  1. Upload an icon of your choice to a library of your choice (for this demo, I created a document library called Site Assets in the Employee Matters site). Make note of the url to the icon. The icon should be 64x64 pixels.
  2. From the SharePoint Online Management Shell thingy, enter the following:
    Set-SPOHubSite -Identity https://yourtenant.sharepoint.com/sites/employeematters -LogoUrl https://yourtenant.sharepoint.com/sites/employeematters/site%20assets/employeemattersicon.png -Description "Find resources for employees"

    Make sure to replace the LogoUrl with the url to the icon you want (and to put whatever description you want for the site hub).

  3. Your site hub will eventually get updated. Go take a look.

By the way, there is a user interface to change the site hub logo, but there isn't one to change the description. You can get to it by following these steps:

  1. Using your browser, go to your site hub.
  2. From the site hub home page, select the settings gear and select Hub site settings
    hubsite3.png
  3. From the Edit hub site settings pane that appears, you can change the icon or the site hub title. Not the description.
    hubsite4
  4. Select Save and your changes will (eventually) be reflected.

Associate "sub" sites to hub site using your browser

  1. Go to the Benefits site you created what seems like a million years ago.
  2. From the settings gear icon, select Site information
    sitesettings1
  3. From Edit site information pane that appears, select the Employee Matters hub site from the Hub site association, then select Save.
    sitesettings2
    Note that, in real life, only users who have been granted the rights to join a site will be able to do this -- but that's another blog post. Also, note that changing the hub site will change the site theme to match the hub site and add its navigation (as is clearly indicated on the Edit site information pane).

You should notice that your Benefits site will now have the Employee Matters navigation added at the top. That means it worked.

Associate "sub" site to hub site using PowerShell

  1. From the SharePoint Online Management Shell, enter the following:
    Add-SPOHubSiteAssociation -Site https://yourtenant.sharepoint.com/sites/Jobs -HubSite https://yourtenant.sharepoint.com/sites/EmployeeMatters

It will associate the Jobs site to the Employee Matters hub. Note that the -Site parameter is the site you want to add to the hub site, while the -HubSite parameter is the hub site.

Use either the PowerShell method or the browser method to add the Training site to the hub site.

Add links to the hub site navigation

The sites associated to your hub site now sport the new fancy hub site navigation, showing Employee Matters, but you'll notice that the navigation did not get updated to show the newly associated sites.

To fix this:

  1. Go to your hub site's home page. You can do so by clicking on Employee Matters from any of your associated sites.
  2. From the hub navigation (top left corner of the hub site, where it says Employee Matters) select Edit.
  3. From the navigation editing pane that appears, select the button to add a new link.
    fancyplus
  4. In the Add a link pop-up that appears, enter the url to the Jobs site in the Address field, type in Jobs for the Display name, then select OK.
    addlink
  5. Repeat until you have added Jobs, Benefits, and Training, then hit Save.
    hubsitenav

Your hub navigation will contain links to each associated site.

News, activities and search results from the hub home will include results from all associated sites, provided that the current user has permissions to each site. It takes a while before the results appear, but they will!

Conclusion

Hub sites are going to be a great addition to SharePoint in Office 365. They aren't going to solve every navigation issue, but they are certainly a step in the right direction.

There is still a lot to cover with theming and security, but that's probably enough for today.

(OR: How to solve the "this property cannot be set after writing has started." error when calling OpenBinaryDirect)

The Problem

I was trying to write a little app to programmatically download files from a SharePoint instance on Office 365 to a local folder on my hard-drive/network file share -- something I've probably done a thousand times -- using this code:

/*
* This code assumes you have already filled the following variables
* earlier in the code.
* Code has been simplified for brevity.
*/
var webUrl = "https://yourtenantgoeshere.sharepoint.com/site/yoursitename";
var username = "yourusernamegoeshere@yourtenantgoeshere.com";
var password = "pleasedonteverwriteyourpasswordincode";
var listTitle = "yourdocumentlibrarytitle";
var destinationFolder = @"C:\temp";

var securePassword = new SecureString();
//Convert string to secure string
foreach (char c in password) {
    securePassword.AppendChar(c);
}
securePassword.MakeReadOnly();

using (var context = new ClientContext(webUrl))
{
    // Connect using credentials -- use the approach that suits you
    context.Credentials = new SharePointOnlineCredentials(username, securePassword);

    // Get a reference to the SharePoint site
    var web = context.Web;

    // Get a reference to the document library
    var list = context.Web.Lists.GetByTitle(listTitle);

    // Get the list of files you want to export. I'm using a query
    // to find all files where the "Status" column is marked as "Approved"
    var camlQuery = new CamlQuery
    {
        ViewXml = @"<View>
            <Query>
                <Where>
                    <Eq>
                        <FieldRef Name='Status' />
                        <Value Type='Text'>Approved</Value>
                    </Eq>
                </Where>
            </Query>
            <RowLimit>1000</RowLimit>
        </View>"
    };

    // Retrieve the items matching the query
    var items = list.GetItems(camlQuery);

    // Make sure to load the File in the context otherwise you won't go far
    context.Load(items, items2 => items2.IncludeWithDefaultProperties
        (item => item.DisplayName, item => item.File));

    // Execute the query and actually populate the results
    context.ExecuteQuery();

    // Iterate through every file returned and save them
    foreach (var item in items)
    {
        // THIS IS THE LINE THAT CAUSES ISSUES!!!!!!!!
        using (FileInformation fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(context, item.File.ServerRelativeUrl))
        {
	    // Combine destination folder with filename -- don't concatenate
            // it's just wrong!
            var filePath = Path.Combine(destinationFolder, item.File.Name);

            // Erase existing files, cause that's how I roll
            if (System.IO.File.Exists(filePath))
            {
                System.IO.File.Delete(filePath);
            }

            // Create the file
            using (var fileStream = System.IO.File.Create(filePath))
            {
                fileInfo.Stream.CopyTo(fileStream);
            }
        }
    }
}

The "usings" at the top of the file were:

using System;
using System.Collections.Generic;
using System.Security;
using Microsoft.SharePoint.Client;
using System.IO;

And every time I ran the code, I'd get a really annoying error on the OpenBinaryDirect method:

this property cannot be set after writing has started.

If I wasn't already bald, I would be after searching everywhere how to solve it.

The Solution

As it turns out, when I created my console application, I followed these steps:

  1. Launch Visual Studio
  2. File | New Project... | Console Application and saved the project
  3. On the newly created project, added Microsoft.SharePoint.Client references by right-clicking on the project's References and selecting Manage Nuget Packages and selecting the first nuget reference that had Microsoft.SharePoint.Client that looked semi-official -- you know, the one that says "by Microsoft"

Wrote the code and quickly ran into the aforementioned error.

As it turns out, I needed to use the Nuget package that said Microsoft.SharePointOnline.CSOM (also by Microsoft).

I removed the Microsoft.SharePoint.Client Nuget package and added Microsoft.SharePointOnline.CSOM instead. It automatically included the right Microsoft.SharePoint.Client and Microsoft.SharePoint.Client.RunTime dependencies it needed.

After recompiling, it worked perfectly.

The way it should have worked several hours ago.

After a lot of cursing, mostly directed at myself, I decided to write this down as a #NoteToSelf. Next time I run into this issue, at least I'll find a blog entry describing the solution.

My own.

If you have a Surface Pro 3 and you've tried to install Windows 10 build 10122, but it hangs at 18% complete (or your PC reboots and reverts to the previous version of Windows), Pieter Wigleven has a solution. His instructions are complete, but I had a few issues getting them to work.

Here are the steps I took to get it to work:

  1. Start the Windows Update process and wait until it starts to download the fbl_impressive update. If you don't start this now, the next few commands may cause Windows Update to install a driver update that seems to be the root cause of the issues.
  2. Create a new folder called Temp on your C: drive.
  3. Download psexec.exe from SysInternals and extract it to c:\temp
  4. From your start menu, type Cmd and right-click on the Command Prompt entry that should appear in the results.  Select Run as Administrator. If you get prompted to elevate your privileges, go ahead.
    cmdprompt
  5. From the newly opened command prompt, type:
           C:\Temp\psexec.exe -s -i cmd.exe
  6. From the second command prompt that will open, type this case sensitive command:
    rundll32.exe pnpclean.dll,RunDLL_PnpClean /DRIVERS /MAXCLEAN
  7. Nothing will happen -- that you can see. That's OK.
  8. Close your command prompt windows and wait for the upgrade to start.

That's all you should need! If you run into problems, restart your Surface and wait until the update has started before you run the commands above -- as stated in step 1. It doesn't seem to work if Windows Update installs the driver updates.

Thanks Pieter!

You may never need this tip, but I recently ran into an issue where my article page's Edit Page button stopped working in SharePoint 2013 (probably something I messed up with the master page... I'll fix it later). I Googled and Binged everywhere, but couldn't find how to switch an article page to edit mode.

All you need to do is append your page URL with the following parameters:

?DisplayMode=Design&ControlMode=Edit

So if your page is:
http://mysharepointserver/pages/tdamnededitbutton.aspx
You would write:
http://mysharepointserver/pages/tdamnededitbutton.aspx?DisplayMode=Design&ControlMode=Edit

I hope it saves someone else from having to search.

Did anyone have any problems with the Edit button not working? Share below!

In my previous article, I discuss best practices on how to choose high resolution photos to use in user profile pictures for Office 365.

You can upload user profile pictures using the Office 365 Admin Center. It may be obvious to everyone else, but I didn’t know this was possible until a very astute co-op student showed me this feature (after I spent an afternoon telling him the only way to do this was to use PowerShell). So, to save you the embarrassment, here is the web-based method:

  1. From the Office 365 Admin Center (https://portal.office.com) go to Admin then Exchange.
  2. In the Exchange Admin Center click on your profile picture and select Another User… from the drop-down menu that appears.
    image
  3. The system will pop-up a window listing users in your Office 365 subscription. Search for the user you wish to change and click OK.
    image
  4. The system will pop-up the user’s profile, indicating that you are working on behalf of the user you selected. Scroll all the way to the bottom and select Edit Information…
    image
    image
  5. Another pop-up window (seriously, disable your pop-up blockers if you haven’t done so already) will show the editable user profile page, starting with the Photo section. Click on Change
    image
  6. Click on Browse… and select the picture you wish to use.
    image
  7. Click Save to dismiss the window. Close all the pop-ups.

Repeat for all user profile pictures you wish to upload. If you have Lync open, you should see the results almost immediately.

The profile picture will also be automatically synched with SharePoint user profiles (at least, that has been my experience… please feel free to comment below if you’ve had different results).

While it may be handy to do a few pictures, if you have to update hundreds of user profile pictures, you may want to use the PowerShell method.

In Office 365, you can upload profile pictures for each user’s contact card. The contact card will appear in Outlook, SharePoint, Lync, Word, Excel, PowerPoint… well, in any Office product that displays contact cards 🙂

Sample Contact Card in Outlook 2013

While this isn’t a new concept in Office 2013, and this feature is also available in on-premises installations, these articles focus on Office 365.

There are two ways to achieve this: through the web-based Exchange admin interface, or through PowerShell.

You’ll find all sorts of confusing information online regarding the dimensions, file size, and format restrictions. I found that either of the two methods described in this article will work with almost any file size and dimensions.

There are, however, some best practices.

Choose Square Photos

Choose a square image as the source (i.e.: same width and height), otherwise the picture will be cropped when you upload and you may end up with portions of people’s faces being cropped out.

Example of a great picture, wrong shape... (Photo Credit: rubenshito)

Will be automatically cropped to:

Auto-cropped result.

Go for the Max

Lync 2010 supported the ability to view contact photos which were stored as part of the thumbnailPhoto attribute in Active Directory, meaning that pictures could only be 48x48 pixels.

However, Lync 2013 can now store photos in the user’s Exchange 2013 mailbox, meaning that it supports images of up to 648x648 pixels.

When you upload a photo to Exchange 2013, it automatically creates 3 versions of the photo:

Size      Used By
48x48     Active Directory thumbnailPhoto attribute
96x96     Outlook 2013 Web App, Outlook 2013, Lync Web App, Lync 2013, SharePoint
648x648   Lync 2013, Lync Web App

If you only upload a smaller image (e.g.: 48x48), it’ll be scaled up to 96x96 and 648x648, resulting in photos that look fuzzy. However, if you upload photos that are already 648x648, the system will automatically generate the 48x48 and 96x96 thumbnails for you.

(Original vs. auto-scaled comparison images)

(Photo Credit: rubenshito)

Note that if you upload a photo to the thumbnailPhoto attribute in Active Directory, the photo will not be updated in Exchange. If you are lazy like me, you probably want to update photos only once.

My recommendation (and Microsoft's) is to use 648x648 pixels, 24-bit JPG images.

Although you can use the web-based GUI to update profile pictures on Office 365, sometimes you need to upload many pictures at once.

This is where PowerShell comes in handy. Here are the instructions to upload high resolution user profile pictures to Office 365 using PowerShell commands:

    1. Launch the PowerShell console using Run as Administrator
      image
    2. In the PowerShell console, provide your Office 365 credentials by typing the following command and hitting Enter:
      $Creds = Get-Credential
    3. You’ll be prompted to enter your credentials. Go ahead, I’ll wait.
    4. Create a PowerShell remote session to Office 365/Exchange by entering the following command and hitting Enter:
      $RemoteSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/?proxymethod=rps -Credential $Creds -Authentication Basic -AllowRedirection
    5. Initialize the remote session by entering:
               Import-PSSession $RemoteSession
    6. Doing so will import all the required Cmdlets to manage Exchange -- this is why you don’t need to install any Exchange PowerShell modules or anything like that.
    7. If you get an error at this time telling you something about script execution not being enabled (or something like that, I never read the actual error message), enter the following command to enable remotely signed scripts:
      Set-ExecutionPolicy RemoteSigned
      

      The above command is only required if you got an error. Some articles may say that you need to set the execution policy to Unrestricted, but – being paranoid – I prefer to limit the policy to remote signed commands. If you got an error while trying to set the execution policy, it is most likely because you forgot to Run as Administrator as indicated in step 1 above. Tsk tsk, pay attention!
      Once you set the execution policy without an error, try step 5 again.

    8. Once the session has been imported, you’ll have new Cmdlets available – the most important one being Set-UserPhoto. But before you call Set-UserPhoto, you need to tell PowerShell which photo to use. To do so, enter:
      $photo = "pathofyourphoto.jpg"
      

      Make sure to replace pathofyourphoto.jpg with the path to the picture you wish to upload.

    9. Now you can set the user’s photo by using the following command:
      Set-UserPhoto -Identity "testuser@xyz.com" -PictureData ([System.IO.File]::ReadAllBytes($photo)) -Confirm:$false

      Make sure to replace testuser@xyz.com with the user ID of the profile you wish to change.

    10. Repeat steps 8-9 until all your pictures have been uploaded. If you’d rather iterate through all the pictures automatically, see the batch sketch after the full script below.
    11. When done, call
      Remove-PSSession $RemoteSession

 
For your convenience, here is the whole PowerShell script:

$Creds = Get-Credential
$RemoteSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/?proxymethod=rps -Credential $Creds -Authentication Basic -AllowRedirection
Import-PSSession $RemoteSession
$photo = "pathofyourphoto.jpg"
Set-UserPhoto -Identity "testuser@xyz.com" -PictureData ([System.IO.File]::ReadAllBytes($photo)) -Confirm:$false
Remove-PSSession $RemoteSession
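
And if, like me, you’d rather not run Set-UserPhoto once per person, here’s a rough sketch of a batch version. It assumes – my assumption, not from the original instructions – that every photo in the folder is named after the user’s sign-in address (e.g.: testuser@xyz.com.jpg), and that you run it inside the same remote session:

# Hypothetical batch upload: assumes each .jpg in C:\Photos is named
# after the user's sign-in address (e.g.: testuser@xyz.com.jpg).
Get-ChildItem -Path "C:\Photos" -Filter *.jpg | ForEach-Object {
    $userId = [System.IO.Path]::GetFileNameWithoutExtension($_.Name)
    Set-UserPhoto -Identity $userId -PictureData ([System.IO.File]::ReadAllBytes($_.FullName)) -Confirm:$false
    Write-Host "Uploaded photo for $userId"
}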

If you used the PowerShell script above, you’ll be able to upload 648x648 pixel photos for yourself and other users without any issues. If you didn’t use this script, but you get the following error:

The remote server returned an error: (413) Request Entity Too Large

...it is most likely because you connected to your remote PowerShell session without setting the proxy method.  Compare the two PowerShell commands:

Works Only with Photos 10Kb or Below
$RemoteSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Creds -Authentication Basic -AllowRedirection
Works with Photos Greater than 10Kb
$RemoteSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/?proxymethod=rps -Credential $Creds -Authentication Basic -AllowRedirection
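
Once you’re connected with the right proxy method, you can also verify an upload by reading the photo back. Here’s a quick sketch, assuming the Get-UserPhoto cmdlet was imported with your session and that it exposes the raw bytes in a PictureData property:

# Read the photo back and save it locally so you can inspect the result.
# (Assumes Get-UserPhoto returns the image bytes in PictureData.)
$uploaded = Get-UserPhoto -Identity "testuser@xyz.com"
[System.IO.File]::WriteAllBytes("C:\Photos\check.jpg", $uploaded.PictureData)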

I hope the information above helped!

For more information

Set-UserPhoto CmdLet
http://technet.microsoft.com/en-us/library/jj218694.aspx

Configuring the use of high-resolution photos in Microsoft Lync Server 2013
https://technet.microsoft.com/en-us/library/jj688150.aspx

Introduction

Microsoft Expression Blend makes it easy to launch animations – usually all it takes is a few clicks. But sometimes you need to launch animations from within your code – for example, to launch an animation after performing calculations. This article will show you how to play an animation from within your code.

Background

If you're building a SketchFlow application, you can right-click the control that should trigger the animation, select Play SketchFlow Animation, and pick the animation you want to trigger.

If you're not building a SketchFlow application, you can drag a ControlStoryboardAction from the Assets pane onto the control that should trigger the animation.

From the Properties pane, you can then select what Storyboard you want to launch, and what event (EventName) you want to launch the animation:

But when you want to use code to launch an animation, you need to take a few more steps. Here's how:

  1. First: create your animation (d'uh!)
  2. Remember the name you gave the animation. If you can't remember it, open your XAML and look for a Storyboard element. The name of the animation can be found in the x:Key attribute.
    <UserControl.Resources>
        <Storyboard x:Key="myAnimation">

        </Storyboard>
    </UserControl.Resources>
  3. Now crack open the code-behind page. We'll write some code!
  4. Make sure that you have System.Windows.Media.Animation in your using section. If not, add it by adding the following line:
    using System.Windows.Media.Animation;
  5. In the event handler where you want to launch your animation, declare an object of type Storyboard and load it from the page's resources, using the animation name as the key, as follows:

    Storyboard myAnimationStoryboard = this.Resources["myAnimation"] as Storyboard;

  6. After you've verified that the object you retrieved isn't null, you can start the animation by calling the Storyboard's Begin method.

    myAnimationStoryboard.Begin();

 

That's really all there is to it. I personally like to declare the Storyboard as a member variable and assign it in the constructor, then I use the Storyboard object anywhere I need it.
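
Here's a minimal sketch of that member-variable pattern – the control name and event handler are made up for illustration:

    using System.Windows;
    using System.Windows.Controls;
    using System.Windows.Media.Animation;

    public partial class MyUserControl : UserControl
    {
        // Declared once as a member variable, assigned in the constructor.
        private readonly Storyboard _myAnimationStoryboard;

        public MyUserControl()
        {
            InitializeComponent();
            _myAnimationStoryboard = this.Resources["myAnimation"] as Storyboard;
        }

        // Any event handler (or calculation callback) can now play it.
        private void OnCalculationCompleted(object sender, RoutedEventArgs e)
        {
            if (_myAnimationStoryboard != null)
            {
                _myAnimationStoryboard.Begin();
            }
        }
    }

Looking the Storyboard up once in the constructor avoids repeated resource lookups and gives you a single place to deal with a missing or renamed resource.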

Once you've developed XAML applications, you'll think this article is silly, but until you do, I hope that it'll save you some searching!

More Information

Learn more about the Storyboard class, including how to control a storyboard at http://msdn.microsoft.com/en-us/library/system.windows.media.animation.storyboard.aspx

Learn about How to Control a Storyboard After It Starts at http://msdn.microsoft.com/en-us/library/ms741997.aspx


A new feature in SharePoint 2010 is that you can customize the form that is displayed when you create a new list item by using InfoPath. That means that you can leverage the extensive capabilities of InfoPath without having to write a single line of code – and that’s a good thing, if you’re as lazy as I am and want to avoid resorting to custom code.

To do it, you simply go to the list you want to customize, click on Customize Form, and edit the InfoPath form that was thoughtfully created for you. Once you’ve published the form, you’ve got a custom form for your list. SharePoint will automatically create a read-only version of your form for displaying items. Easy!

But what if you want a different form for creating new items, one for editing items, and one for viewing items? In this post, I’ll show you how to create a different InfoPath view for the New, Edit, and Display forms, as pictured below:

Here are the steps:

  1. Go to the list you want to customize (or create a new list). For this example, I’ll be customizing an Issues List.
  2. Select the List ribbon and, from the Customize List group, click on Customize Form.
  3. SharePoint will open InfoPath and load the default form. You can customize it just like you would any other InfoPath form. Just make sure you stick to browser compatible settings, because the form will be loaded using InfoPath Forms Services.
  4. For this sample, I’ll add a title to every form and change the colours of each form (so that we can prove that it works). Let’s treat this form as the New Item form:
  5. We’ll create two more views (one for Edit, and one for Display). To make things less confusing, let’s rename the view. We do this by changing the view’s properties using the following steps: switch to the Page Design ribbon, and select Properties from the Views group.
  6. In the View Properties dialog, change the View name to New item, then click OK. This step is optional, but it’ll make things less confusing later – trust me.
  7. Back on the New item view, select the entire content of the form and copy it to the clipboard. We’ll paste the form’s content into a new view.
  8. From the Page Design ribbon, click on New View in the Views group.
  9. Name the new view Edit view.
  10. In the newly created view, select all content and replace it with the content you copied from the previous view. Make your changes to the Edit view – in my case, I changed the title of the form to Edit Issue and changed the colour.
  11. Repeat steps 7 to 10, but this time name the newly created view Display view.
  12. I’ll change the title and colour of the Display view.
  13. SharePoint will automatically display the Display view in read-only mode. Since I’m a control freak, I prefer to create my own read-only view by converting every control to a Calculated value field: right-click each control, select Change Control, and pick Calculated value.
  14. The final Display Issue view looks like this:
  15. You can add as many views as you want using the same approach, and use rules to select the appropriate view. For example, you may want to display a different view based on a user’s role. If you do, you probably don’t want users to be able to switch between views. To do so, go to the View Properties for each view, and de-select the Show on the View menu when filling out this form option.
  16. So far, all we’ve done is set up the different views for New, Edit, and Display. The next few steps will configure which view to use when displaying, editing, and creating a new item. Let’s start with the Display form; go to the File menu, then select Info and Advanced form options.
  17. In the Form Options dialog, select the Web Browser category. In the Display View area, select the view you want to appear when displaying an item (in our sample, it is called Display view).
  18. Unfortunately, there is no easy option to set the Edit and New views. We can, however, use Form Rules to change the view when the form is loaded. If the ID field is blank, we’ll assume that the user is creating a new item; if the ID field is not blank, the user is editing the item. To do this, switch to the Data ribbon, and click on Form Load in the Rules group.
  19. InfoPath will open the Rules pane (on the right side of the form). Click on New then Action.
  20. Name the rule Switch to New View. Then click on default condition (None – Rule runs when form is loaded) to create a new condition.
  21. From the Condition dialog, change myFields to Select a field or group… then pick ID from the dialog that pops up. Click OK to return to the Condition dialog.

  22. Change is equal to to is blank then click OK.
  23. Back at the Rules pane, find Run these actions and select Add then Switch views.
  24. From the Rules Details dialog, select New item from the View field and click OK.
  25. You may be tempted to switch the view to Edit item when the ID is not blank, but – if you think about it – that condition would be true both when viewing an item and when editing one. It would make SharePoint switch to the Edit item view when you try to display an item, even if you set up the Display view option in step 17. Don’t do it!
  26. All you need to do now is publish the form! To do so, select the File menu, Info then Quick Publish.
  27. Wait until InfoPath does its thing. You’ll get a message indicating that publishing was successful.
  28. Click on Open the SharePoint list in the browser to test your new forms. Here are the final results:
    What gets displayed when creating a new item:

    When displaying an item:

    When editing an item:

That’s it! Of course, in real life, you’d probably want to customize each form a bit more than just changing the title and colour.

Hope this helps!