Introduction

SharePoint client-side web parts (SPFx) allow you to define custom properties that your users can use to customize your web parts.

You can set default values in your web part's manifest.json file so that the web part is already pre-configured when a user adds your web part to a page.

For example, the following (fictitious) Deflatinator web part -- which allows you to shoot a beam that will deflate everything within the Tri-state area -- has four custom properties:

  • deflateBeachBalls (boolean, default true) controls if it will deflate beach balls
  • deflateBlimps (boolean, default true) controls if it will deflate blimps
  • maxMirrorBounce (number, default 3) controls how many times the beam can bounce off mirrors (and increases the chances that something will go wrong)
  • curseYou (string, default Perry! (what else?)) controls who will be cursed if your plans go wrong.

Your web part's props will be defined as follows:

export interface IDeflatinatorWebPartProps {
  deflateBeachBalls: boolean;
  deflateBlimps: boolean;
  maxMirrorBounce: number;
  curseYou: string;
}

Your Deflatinator.manifest.json file would include a preconfiguredEntries section that looks like this:

  "preconfiguredEntries": [{
    "groupId": "5c03119e-3074-46fd-976b-c60198311f70",
    "group": { "default": "Other" },
    "title": { "default": "Deflatinator" },
    "officeFabricIconFontName": "Pinned",
    "description": { "default": "Deflates everything within the Tri-state area." },
    "properties": {
      "deflateBeachBalls": true,
      "deflateBlimps": true,
      "maxMirrorBounce": 3,
      "curseYou": "Perry!"
    }
  }]

Every time a user adds your Deflatinator web part, it will have those default values. If you configured your custom properties, your users will be able to customize the values as they wish.

The default values defined in your manifest.json are static -- that is, the default value your users will receive will always be the same unless you change your manifest.json.

But what if you want different pre-configurations to be available to users?

Better yet, what if you want default values that change dynamically, depending on the user's language, permissions, or preferences? How about the SharePoint environment, current date, the content of a list, or anything else?

Luckily, SPFx supports this!

Specifying multiple (but static) pre-configured entries

The first -- and easiest -- way to offer different configurations is to define multiple pre-configured entries in your manifest.json file.

For example, here is my Deflatinator.manifest.json file with two versions of the web part: one that deflates blimps by default (deflateBlimps is true), and one that does not (deflateBlimps is false):

  "preconfiguredEntries": [{
    "groupId": "5c03119e-3074-46fd-976b-c60198311f70",
    "group": { "default": "Other" },
    "title": { "default": "Deflatinator" },
    "officeFabricIconFontName": "Pinned",
    "description": { "default": "Deflates everything within the Tri-state area." },
    "properties": {
      "deflateBeachBalls": true,
      "deflateBlimps": true,
      "maxMirrorBounce": 3,
      "curseYou": "Perry!"
    }
  },
  {
    "groupId": "5c03119e-3074-46fd-976b-c60198311f70",
    "group": { "default": "Other" },
    "title": { "default": "Deflatinator -- No blimps" },
    "officeFabricIconFontName": "Pinned",
    "description": { "default": "Deflates everything except for blimps within the Tri-state area." },
    "properties": {
      "deflateBeachBalls": true,
      "deflateBlimps": false,
      "maxMirrorBounce": 3,
      "curseYou": "Perry!"
    }
  }]

When users open the web part catalog, they will see two entries: Deflatinator and Deflatinator -- No blimps. Depending on which web part entry they choose, the web part will either deflate blimps by default or not.

This is a good approach if you have a web part that can be used in a lot of different ways (like a web part with different views, or an Embed web part that allows you to embed different types of things in a page).

It is also a good way to emphasize different functionality within your web part.

However, it can also lead to over-crowding of your web part catalog. (Imagine if we needed one pre-configured Deflatinator web part for every possible first name in the curseYou property!)

Specifying dynamic defaults

Luckily, you can set default properties dynamically -- at run-time, when the user adds your web part to the page -- using the onInit event in your web part code.

During the onInit event, you can set the default properties any way you want.

The only tricky bit is that onInit has to return a Promise<void> -- but don't let that scare you!

Here is some code that sets the same default values as above:

protected onInit(): Promise<void> {
    // create a new promise
    return new Promise<void>((resolve, _reject) => {

        // set a default if Deflate Beach Balls has not been defined
        if (this.properties.deflateBeachBalls === undefined) {
            this.properties.deflateBeachBalls = true;
        }

        // set a default if Deflate Blimps has not been defined
        if (this.properties.deflateBlimps === undefined) {
            this.properties.deflateBlimps = true;
        }

        // set a default if Mirror Bounce has not been defined
        if (this.properties.maxMirrorBounce === undefined) {
            this.properties.maxMirrorBounce = 3;
        }

        // set a default if Curse You name hasn't been defined
        if (this.properties.curseYou === undefined) {
            this.properties.curseYou = 'Perry!';
        }

        // resolve the promise
        resolve(undefined);
    });
}
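By the way, nothing forces you to build the promise by hand. Since nothing asynchronous happens here, a minimal equivalent sketch (same defaults, same Promise<void> signature) can simply return an already-resolved promise:

protected onInit(): Promise<void> {
  // apply each default only when the property has not been configured yet
  if (this.properties.deflateBeachBalls === undefined) {
    this.properties.deflateBeachBalls = true;
  }
  if (this.properties.deflateBlimps === undefined) {
    this.properties.deflateBlimps = true;
  }
  if (this.properties.maxMirrorBounce === undefined) {
    this.properties.maxMirrorBounce = 3;
  }
  if (this.properties.curseYou === undefined) {
    this.properties.curseYou = 'Perry!';
  }

  // nothing asynchronous happened, so an already-resolved promise will do
  return Promise.resolve();
}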

Of course, make sure to update your manifest.json file as follows:

  "preconfiguredEntries": [{
    "groupId": "5c03119e-3074-46fd-976b-c60198311f70",
    "group": { "default": "Other" },
    "title": { "default": "Deflatinator" },
    "officeFabricIconFontName": "Pinned",
    "description": { "default": "Deflates everything within the Tri-state area." },
    "properties": {
    }
  }]

NOTE: If you find that your changes to the manifest.json file don't seem to take effect when debugging your solution, you may need to stop debugging, run gulp bundle, then restart debugging.

Using localized default values

The code above does exactly the same thing as if you defined default values in your manifest.json. If that's all you need, stick to setting the default values in the manifest.json.

Let's try setting the default curseYou property to a localized name:

// assumes that when you created your web part it defined your localized strings
// and that you added a DefaultCurseYouName property
import * as strings from 'DeflatinatorWebPartStrings';
...
protected onInit(): Promise<void> {
    // create a new promise
    return new Promise<void>((resolve, _reject) => {

        // set a default if Deflate Beach Balls has not been defined
        if (this.properties.deflateBeachBalls === undefined) {
            this.properties.deflateBeachBalls = true;
        }

        // set a default if Deflate Blimps has not been defined
        if (this.properties.deflateBlimps === undefined) {
            this.properties.deflateBlimps = true;
        }

        // set a default if Mirror Bounce has not been defined
        if (this.properties.maxMirrorBounce === undefined) {
            this.properties.maxMirrorBounce = 3;
        }

        // set a default if Curse You name hasn't been defined
        if (this.properties.curseYou === undefined) {
            // BEGIN CHANGED: use the localized default name
            this.properties.curseYou = strings.DefaultCurseYouName;
            // END CHANGED
        }

        // resolve the promise
        resolve(undefined);
    });
}
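For this to compile, your localization files need the new string. A minimal sketch, assuming the loc/mystrings.d.ts and loc/en-us.js files the Yeoman generator created for you (DefaultCurseYouName is the only addition):

// loc/mystrings.d.ts
declare interface IDeflatinatorWebPartStrings {
  PropertyPaneDescription: string;
  BasicGroupName: string;
  DescriptionFieldLabel: string;
  // ADDED: the localized default name for the curseYou property
  DefaultCurseYouName: string;
}

declare module 'DeflatinatorWebPartStrings' {
  const strings: IDeflatinatorWebPartStrings;
  export = strings;
}

// loc/en-us.js
define([], function() {
  return {
    "PropertyPaneDescription": "Description",
    "BasicGroupName": "Group Name",
    "DescriptionFieldLabel": "Description Field",
    "DefaultCurseYouName": "Perry!"
  }
});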

Using current date and time

Ok, let's make things a bit more complicated. Let's pretend that your web part has a countdown (to indicate when the Deflatinator will trigger, of course) and that you want to store the triggerTime in a web part property.

You could update your IDeflatinatorWebPartProps to include a triggerTime prop:

export interface IDeflatinatorWebPartProps {
  deflateBeachBalls: boolean;
  deflateBlimps: boolean;
  maxMirrorBounce: number;
  curseYou: string;

  // BEGIN ADDED: Add triggerTime
  triggerTime: Date;
  // END ADDED
}

Now let's pretend that you want the triggerTime to automatically default to one day from when the user adds the web part. You would change your onInit method as follows:

  protected onInit(): Promise<void> {
    // create a new promise
      return new Promise<void>((resolve, _reject) => {

      // set a default if Deflate Beach Balls has not been defined
      if (this.properties.deflateBeachBalls === undefined) {
        this.properties.deflateBeachBalls = true;
      }

      // set a default if Deflate Blimps has not been defined
      if (this.properties.deflateBlimps === undefined) {
        this.properties.deflateBlimps = true;
      }

      // set a default if Mirror Bounce has not been defined
      if (this.properties.maxMirrorBounce === undefined) {
        this.properties.maxMirrorBounce = 3;
      }

      // set a default if Curse You name hasn't been defined
      if (this.properties.curseYou === undefined) {
        this.properties.curseYou = strings.DefaultCurseYouName;
      }

      // BEGIN ADDED: set a default Trigger Date
      if (this.properties.triggerTime === undefined) {
        // Get the current date
        const defaultTrigger: Date = new Date();

        // Add one day
        // I know, I know, I could use momentjs, but this is
        // the cheesy way to do it without extra libraries
        defaultTrigger.setDate(defaultTrigger.getDate() + 1);

        // Set the default date
        this.properties.triggerTime = defaultTrigger;
      }
      // END ADDED

      // resolve the promise
      resolve(undefined);
    });
  }

When the user adds your web part, the default triggerTime will automatically calculate tomorrow's date.

NOTE: you'll notice that all my code above tests that the property is undefined before setting the value. This handles cases where a default value is already configured in the manifest.json. It is not necessary, but it doesn't hurt to be extra careful, right?

Using current user information

So far, we've used pretty simple tricks to set default properties to a dynamic value, but what if we wanted to do something a bit more difficult? What if we wanted to use (gasp!) Promises?! (Insert ominous music here)

Let us pretend that -- for whatever reason -- we wanted the web part's default property to use the name of the user who inserted the web part.

For this, we will use the awesome PnP/PnPjs libraries.

First, start by installing the library to your project by using the instructions from the PnP/PnPjs getting started page:

npm install @pnp/common @pnp/sp @pnp/logging @pnp/odata

NOTE:
YOU: "Hey, everything I have seen -- including the PnP documentation -- says that I need to add --save in my npm install command. You did not do that! Did you forget it?"
ME: No, the --save parameter is no longer required with npm install (see the documentation). It does not hurt if you have it, but it does not do anything anymore ... assuming, of course, that you have a current version of npm.

Then add PnP libraries to your imports at the top of your web part code:

import { sp } from "@pnp/sp";
import { CurrentUser } from '@pnp/sp/src/siteusers';

Then change your onInit as follows:

protected onInit(): Promise<void> {
    // create a new promise
    return new Promise<void>((resolve, _reject) => {
      // set a default if Deflate Beach Balls has not been defined
      if (this.properties.deflateBeachBalls === undefined) {
        this.properties.deflateBeachBalls = true;
      }

      // set a default if Deflate Blimps has not been defined
      if (this.properties.deflateBlimps === undefined) {
        this.properties.deflateBlimps = true;
      }

      //MOVED: moved the code to set the default Curse You name to the end of this function

      // Set a default Trigger Date
      if (this.properties.triggerTime === undefined) {
        // Get the current date
        const defaultTrigger: Date = new Date();

        // Add one day
        defaultTrigger.setDate(defaultTrigger.getDate() + 1);

        // Set the default date
        this.properties.triggerTime = defaultTrigger;
      }

      // set a default if Mirror Bounce has not been defined
      if (this.properties.maxMirrorBounce === undefined) {
        this.properties.maxMirrorBounce = 3;
      }

      // BEGIN CHANGED: If there is no one to curse, get the current user
      if (this.properties.curseYou === undefined) {
        // No default value, get the current user's name
        sp.web.currentUser
          .select("Title") // don't retrieve everytyhing, we just want the display name
          .get()
          .then((r: CurrentUser) => {
            // set a default if Curse You name hasn't been defined

            // I always set a default value in case I can't get the current user's name
            let curseYouUser: string = strings.DefaultCurseYouName;

            // If we got user properties
            if (r !== undefined) {
              console.log("Yes to current user", r["Title"]);
              curseYouUser = r["Title"];
            }

            this.properties.curseYou = curseYouUser;

            // resolve the promise when done
            resolve(undefined);
          });
      } else {
        // Resolve the promise
        resolve(undefined);
      }
      // END CHANGED
    });
  }

You could also use the same approach to retrieve data from a SharePoint list, or from an external API.
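For example, here is a sketch that defaults curseYou to the first item of a (hypothetical) Nemeses list -- the list name and field are made up for illustration, but the PnPjs pattern is the same as above. It slots into the same spot in onInit and uses the same resolve from the surrounding promise:

if (this.properties.curseYou === undefined) {
  sp.web.lists.getByTitle("Nemeses") // hypothetical list name
    .items
    .select("Title")
    .top(1)
    .get()
    .then((items: { Title: string }[]) => {
      // fall back to the localized default if the list is empty
      this.properties.curseYou = items.length > 0 ? items[0].Title : strings.DefaultCurseYouName;
      resolve(undefined);
    });
} else {
  resolve(undefined);
}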

Bonus benefits

There is an added benefit to setting default values in the onInit method: if you are debugging and testing your code and want to make changes to the default values, you can just change the code in your onInit, and your changes will be reflected the next time you add the web part to a page.

If you changed your default values in your manifest.json instead, you would need to stop debugging, run gulp bundle, restart debugging, remove the web part, refresh the page, and re-add the web part.

For a lazy person like me, it is much easier to change the onInit method. Just keep in mind that there are valid scenarios (like when you need to offer multiple versions of your web part) where it is better to use the manifest.json preconfiguredEntries.

Also, it doesn't need to be a one-size-fits-all scenario: you can combine some entries in the manifest.json with some code in your onInit. That is why my code above always verifies that the value is undefined before I attempt to apply default values.

Just keep in mind that onInit gets called often. You want the code to be as fast and optimized as possible. For example, make sure the value you want to set as a default is really empty before you call an API to get a default value.

Conclusion

SPFx allows you to pre-configure default values for your web part custom properties that get applied when a user first adds the web part to a page.

When you want to dynamically set default values, you can override the onInit method to apply any logic you need.

In this article, I used a completely nonsensical web part to demonstrate the concepts, but you can apply the same principles in your own (hopefully, less nonsensical) web parts.

I hope this helps?

Introduction

I hate acronyms.

Should have thought of that before getting into IT for a living!

One of the most annoying acronyms, to me, is CORS. It is annoying because it shows up in an error message when you're trying to make an HTTP request to a URL external to SharePoint.

It may be hard to diagnose if you don't handle your HTTP request rejections, or if you don't have your developer tools enabled in your browser, but when you do, you'll get an error message that looks like this:

workbench.html:1 Access to fetch at 'https://somecoolexternalapi.com/api' from origin 
'https://localhost:4321' has been blocked by CORS policy: Response to preflight request doesn't 
pass access control check: It does not have HTTP ok status.

This article will explain what CORS is, and how to avoid issues with CORS when making HTTP requests to an external resource.

What is CORS?

NOTE: I'm over-simplifying the explanation and definition of CORS. If you want the real definition, go look at Wikipedia. Just don't scream at me for being slightly inaccurate, ok? 🙂

CORS stands for Cross-origin resource sharing. It is a way to control how stuff from one web site (like images, CSS, scripts, and even APIs) is shared with other web sites.

When it isn't busy ruining your day, CORS can be useful because it allows you to prevent people from pointing to your web site to steal resources from it (while causing extra traffic). Or worse.

It usually works by looking at the domain where the request originates from (e.g.: mytenant.sharepoint.com) and comparing it against the domain where the resource sits (e.g.: mycoolapi.com). If the two domains aren't the same, it is a cross-domain request or -- in CORS terms -- a cross-origin request.

While you can do some CORS validation on the server-side (that's another blog), it is usually enforced by your browser. In fact, the CORS standard requires that any request that potentially changes data (like an API call) be pre-validated by your browser before even requesting the resource. That pre-verification is called preflight.
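On the wire, a preflight exchange looks roughly like this (simplified; real exchanges carry more headers):

OPTIONS /api HTTP/1.1
Host: somecoolapi.com
Origin: https://mytenant.sharepoint.com
Access-Control-Request-Method: GET

HTTP/1.1 200 OK
Access-Control-Allow-Origin: https://mytenant.sharepoint.com
Access-Control-Allow-Methods: GET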

It goes a little something like this:

CLIENT-SIDE COMPONENT: "Hey browser, please call this API from https://somecoolapi.com"
BROWSER: "Sure thing. Lemme ask.". "Hmm, somecoolapi.com is a different domain than mytenant.sharepoint.com, where we are now. I should check first"; calls somecoolapi.com.
WEBSITE: "New domain, who dis?"
BROWSER: "Hey, someone from origin: mytenant.sharepoint.com would like to get access to your API. You can find out all about it in my OPTIONS HTTP headers."
WEBSITE: "Sure, I don't see any reasons why you shouldn't be allowed. Here, let me give you some Access-Control-Allow-Origin headers to confirm I'm ok with it. Just make sure you only GET stuff, no POST or DELETEs, ok?".
BROWSER: "Awesome!"; turns to the client-side component, "Good news! somecoolapi.com said they'll do it!".
BROWSER: Makes request. Gets results. Returns results to the client-side component.
They lived happily ever after.
The End.

Come to think of it, that's exactly how I handle phone calls; If call display is blocked, or it is a number I don't know, I let it go to voicemail. If it is my wife, I answer. She then asks me to buy more Nespresso coffee on the way home. I usually accept the request, because standing between my wife and coffee is like standing between a mother bear and her cub: dangerous.

So, CORS may be annoying, but it is useful.

The problem is that when you make requests to another domain in an SPFx web part using SPHttpClient, you're making a request from mytenant.sharepoint.com. It usually triggers a CORS error.

To make things worse, when you search for the error, you usually get tons of results on how to change the server settings to prevent the issue. Nothing on how to solve it in your client-side web part.

How to solve CORS issues with SPHttpClient

SPHttpClient, included in @microsoft/sp-http, makes it easy to make HTTP requests using the current web part's context.

To access it from your component or service, you need to get the web part's WebPartContext -- I usually pass it into my component's props, like this:

import { WebPartContext } from "@microsoft/sp-webpart-base";
export interface IMyCustomComponent {
   context: WebPartContext;
}

Once you have the WebPartContext, you can make the HTTP request using SPHttpClient, usually something like this:

import { SPHttpClient, SPHttpClientResponse } from '@microsoft/sp-http';

…
/* When ready to make request */
return this.props.context.spHttpClient.get(yourApiUrl, SPHttpClient.configurations.v1)
.then((apiResponse: SPHttpClientResponse) => apiResponse.json())
.then(…); /* Handle the results */

...which is usually when you get the CORS issue.

To avoid the CORS issue, you need to make sure that your request meets the following requirements:

  • No custom HTTP headers -- and note that a Content-Type of 'application/xml' or 'application/json' counts as one.
  • The request method has to be GET, HEAD, or POST.
  • If the method is POST, the content type should be 'application/x-www-form-urlencoded', 'multipart/form-data', or 'text/plain'.

However, SPHttpClient tries to be nice and sets custom HTTP headers for you by default -- which is enough to trigger a preflight.

In order to override those default headers in your SPHttpClient request, just pass a new/clean ISPHttpClientOptions parameter, as follows:

import { SPHttpClient, SPHttpClientResponse, ISPHttpClientOptions } from '@microsoft/sp-http';

…
/* When ready to make request */
const myOptions: ISPHttpClientOptions = {
      headers: new Headers(),
      method: "GET",
      mode: "cors"
    };

return this.props.context.spHttpClient.get(yourApiUrl, SPHttpClient.configurations.v1, myOptions)
.then((apiResponse: SPHttpClientResponse) => apiResponse.json())
.then(…); /* Handle the results */

And that should be it.

Conclusion

CORS can be scary, it can be annoying, but it is a good thing.

You can avoid CORS issues when using SPHttpClient in your SPFx component by passing an ISPHttpClientOptions that doesn't set custom headers.

I only covered how to make GET requests in the code above. You can use a similar approach for HEAD and POST requests.
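For example, a POST sketch under the same constraints -- the payload here is a made-up form-urlencoded string, and the endpoint still has to allow the request:

import { SPHttpClient, SPHttpClientResponse, ISPHttpClientOptions } from '@microsoft/sp-http';

…
/* When ready to make request */
const postOptions: ISPHttpClientOptions = {
      headers: new Headers(),
      method: "POST",
      mode: "cors",
      body: "name=Perry" /* hypothetical payload; keep the content type within the allowed list */
    };

return this.props.context.spHttpClient.post(yourApiUrl, SPHttpClient.configurations.v1, postOptions)
.then((apiResponse: SPHttpClientResponse) => apiResponse.json())
.then(…); /* Handle the results */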

This approach won't always work (for example, if the API you're calling requires custom HTTP headers), but it should solve most other CORS issues.

And if you have any more questions, post a comment, e-mail, or text. Don't call 🙂

I hope it helps?

Introduction

If you write SPFx web parts or extensions using React, you may have had to assign more than one CSS class to the same element. To do so, you simply list all the CSS class names inside a string, separated by spaces, like this:

public render(): React.ReactElement<IDemoProps> {
    return (
      <div
        className={"myClass mySelectedClass myEnabledClass"}>
    ...
    </div>);
}

However, if you want to dynamically assign CSS classes, the string gets a bit more complicated.

For example, if you wanted to add a CSS class only when the element is selected, and also use a different CSS class depending on whether the object is enabled, you would combine a whole bunch of conditional operators inside your string.

Something like this:

public render(): React.ReactElement<IDemoProps> {
    const {
        selected,
        enabled } = this.state;

    return (
      <div
        className={"myClass " 
            + (selected ? "mySelectedClass " : "") 
            + (enabled ? "myEnabledClass" : "myDisabledClass")}>
    ...
    </div>);
}

Note that I had to include a space after myClass and mySelectedClass because, if they get concatenated in a string and I forget to include the space, the className attribute will be:

myClassmySelectedClassmyEnabledClass

instead of:

myClass mySelectedClass myEnabledClass

Which is obvious now that I write it, but when it is 3 in the morning and you're trying to figure out why your CSS class isn't working properly, it is a small mistake that can be very annoying.

And if your logic gets even more complicated, your CSS class name concatenation can be pretty unruly.

Luckily, the standard SPFx solution has a built-in helper.

@uifabric/utilities/lib/css

Courtesy of our Office UI Fabric friends, there is a helper function that takes a list of CSS class names and concatenates them for you.

And the best part is: it is already included inside your SPFx solution!

To use it, start by importing the CSS utilities:

import { css } from "@uifabric/utilities/lib/css";

And replace all that concatenation ugliness with a simple call to css, as follows:

public render(): React.ReactElement<IDemoProps> {
    const {
        selected,
        enabled } = this.state;

    return (
      <div
        className={css("myClass", 
            selected &amp;&amp; "mySelectedClass", 
            enabled ? "myEnabledClass" : "myDisabledClass")}>
    ...
    </div>);
}

The function takes care of adding spaces between the classes. For example, the following code:

className={css('a', 'b', 'c')}

will produce:

className={'a b c'}

It also skips the "falsey" values (according to comments in their code). In other words, you can evaluate class names that result in a null, undefined, or false value and it will skip it.

For example, the following code:

className={css('a', null, undefined, false, 'b', 'c')}

Will produce:

className={'a b c'}

You can even pass a dictionary of class names, each with a true/false value, and css will concatenate all the class names that are true, as follows:

className={css('a', { b: true, z: false }, 'c')}

Produces:

className={'a b c'}

But wait! If you order now, you'll also get the ability to pass serializable objects (objects that have a toString() method) -- at no extra charge!

const myObject = { toString: () => 'b' };
...
className={css('a', myObject, 'c')}

Will result in:

className={'a b c'}
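If you're curious how it works, the whole behavior can be approximated in a few lines. A minimal sketch -- not the actual Office UI Fabric implementation, just the idea:

function simpleCss(...args: any[]): string {
  const classes: string[] = [];
  for (const arg of args) {
    if (!arg) {
      continue; // skip "falsey" values: null, undefined, false, ''
    }
    if (typeof arg === 'string') {
      classes.push(arg);
    } else if (arg.toString !== Object.prototype.toString) {
      classes.push(arg.toString()); // serializable objects with their own toString()
    } else {
      // dictionaries: keep only the keys whose values are true
      for (const key in arg) {
        if (arg[key]) {
          classes.push(key);
        }
      }
    }
  }
  return classes.join(' ');
}

console.log(simpleCss('a', { b: true, z: false }, 'c')); // 'a b c'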

Conclusion

As a self-proclaimed World's Laziest Developer, I tend to avoid extra work at any cost. The css helper function, which is already in your SPFx solution, helps avoid writing extra CSS class name concatenation logic by providing something that is versatile, sturdy and -- best of all -- tested!

I know this isn't an earth-shattering or original technique, but I find myself constantly re-opening old SPFx solutions to remember where that css function is defined. This article may save me some searching in the future... and hopefully, help you as well!

Introduction

Today, I was moving my files to my new Surface Studio 2 (which is an awesome development machine!). All my personal files are synched to OneDrive, except for my Visual Studio and GitHub project files, which are -- by default -- stored in c:\users\[myuseraccount]\source\repos.

Synching your personal files to OneDrive makes it really easy to work on multiple devices and ensures that you have a backup in case your workstation is stolen, lost, self-destructed, or abducted by aliens.

Making sure that your project files are also synched ensures that all those prototypes, proofs of concepts, and other code snippets that you never bothered adding to source control are also safe.

This article describes the steps to move your default project location to a folder that can be stored in OneDrive.

Let's make one thing clear: synching your project files to OneDrive does not replace using source control; if you have any production code in your project files, please use source control.

Change the default project directory

  1. In Visual Studio 2017, select the Tools menu, then Options.
  2. In the Options dialog select the Projects and Solutions category, then Locations.
  3. In the Projects location type (or browse to) a folder on your OneDrive where you want your new projects to be created.
  4. Click OK.

    Project location in Visual Studio 2017

Changing the default Github repo location in Visual Studio

  1. In Visual Studio, make sure you're connected to GitHub.
  2. From the Team Explorer pane, go to Settings.
  3. In the Settings pane, select Global Settings.
  4. In the Global Settings pane, type (or browse to) the folder you want to use in the Default Repository Location.
  5. Click Update.

    GitHub Settings

Conclusion

The instructions above will default your new Visual Studio projects and repos to a OneDrive folder, where they'll get synchronized with OneDrive.

Thanks to Daniel Zikmund for the detailed steps on how to set up the folder in Visual Studio.  Also, Andrew Grant has a great video showing how to do the above steps.

I hope this helps!?

UPDATE: I apologize to Daniel Zikmund, I gave your brother Martin credits. Thanks Martin for letting me know.

Introduction

A while ago, I wrote an article describing how you can inject a custom CSS stylesheet on SharePoint modern pages using an SPFx application extension. The code sample is now part of the SharePoint SP-Dev-Fx-Extensions repository on GitHub.

Since the article, I have been getting tons of e-mails asking all sorts of questions about the solution.

Since SPFx 1.6 was released, I took the opportunity to upgrade the solution to the latest and greatest version of the toolset. You can find the latest code on GitHub, or download the latest SharePoint package.

In this post, I'll (hopefully) answer some questions about how to use it.

Be smart!

You should really use the out-of-the-box customizations features before you resort to injecting custom CSS.

There are a few reasons why you shouldn't inject your own CSS:

  • Microsoft can change the HTML layout, element ids, or CSS classes at any time -- thus breaking your custom CSS.
  • Your customizations may hide or otherwise disable (or interfere with) new features Microsoft may introduce in the future.
  • Your customizations will be unsupported by Microsoft. Don't try to open support tickets (unless you're willing to pay for them, I guess).
  • Although the solution uses SPFx application extensions, the SharePoint/SPFx team will not be able to support your customizations.

That being said, there are valid reasons why you may need to inject custom CSS. Vesa and his team had to give careful consideration before accepting my solution as a code sample.

Here are some sample valid reasons for injecting your own CSS:

  • To meet your corporate branding guidelines (but consider using a custom theme first).
  • To solve unique accessibility requirements (like importing a custom font to help with cognitive disabilities, such as dyslexia).
  • To solve a showstopping issue (you know, to shut up one of those bosses/clients who say distasteful stuff like: "we'll only use SharePoint Online/Office 365 if Microsoft fixes the ugly look and feel and [insert bad idea here]").
  • For limited-time customizations (like fixing an issue while you're waiting for Microsoft to fix it, or making it snow on Christmas Eve).

Ok, maybe the last one isn't such a valid reason.

Steps to inject your own CSS

  1. Download the code and build the solution, or download the pre-built solution.
  2. Go to your tenant's app catalog (usually at https://[yourtenant].sharepoint.com/sites/Apps/AppCatalog/Forms/AllItems.aspx)
  3. Drag and drop the sppkg file from step 1 onto the library (or click Upload and select the file).
  4. When it prompts you Do you trust react-application-injectcss-client-side-solution? select Deploy (provided, of course, that you trust the solution!). If you want the extension to be available on all sites, check Make this solution available to all sites in the organization before you select Deploy. You may have to check in the file if it is checked out.
  5. It may take a while for the application extension to show up (I once had to wait overnight for the magical SharePoint elves to deploy the extension).
  6. Meanwhile, create your own CSS file to include your customizations. Name it custom.css (don't worry, I'll show you how to change that default name later).
  7. Upload your custom.css to your root style library (located at https://[yourtenant].sharepoint.com/Style%20Library/Forms/AllItems.aspx). If you have versioning enabled on that library, you may have to check-in the file so that other people can see your custom css. Again, don't worry, I'll show you how to use a different location later.
  8. Your custom CSS should show up!

The most important part of this is that the custom.css is NOT part of the SPFx solution! It is a separate file stored in a publicly-accessible location.

Frequently Asked Questions

It doesn't work!

  • Start by using the default custom.css name, with the default location of https://[yourtenant].sharepoint.com/Style%20Library/Forms/AllItems.aspx. Once it works, we can move/rename the CSS.
  • Use a really obvious CSS to see that the style sheet is getting loaded. Something like:
.ms-compositeHeader-topWrapper {
    margin-top: 5px !important;
    background-color: green;
}

If the above CSS works (by adding an ugly green bar at the top of the page), it means that the extension works and is able to load the custom CSS. Double-check your own CSS rules.

  • Using your browser's developer extensions, check to see if you're getting any kind of HTTP 404 (Not Found) message. If you're getting a 404, your CSS is named wrong or in the wrong place.

It works, but only for me (and other administrators)

  • You probably didn't check in and publish your CSS.

The CSS doesn't get packaged in my solution!

  • It isn't supposed to be! By default, the CSS is uploaded to the root style library (which can be found at https://[yourtenant].sharepoint.com/Style%20Library/Forms/AllItems.aspx).

Why doesn't the CSS get packaged in the solution?

  • I wanted to avoid having to re-deploy the solution every time I wanted to change the CSS.
  • I wanted non-developers to be able to use the application extension.

How do I change the name of the CSS?

  1. Rename your CSS to whatever you want
  2. Upload it to your root style library
  3. Go to your Tenant Wide Extensions list (located at: https://[yourtenant].sharepoint.com/sites/Apps/Lists/TenantWideExtensions/AllItems.aspx)
  4. Select the InjectCssApplicationCustomizer from the list.
  5. Select Edit Item from the ribbon.
  6. In the edit form, change the value in Component Properties to use your new CSS name and hit Save. For example, if you renamed your CSS to contoso.css, you'd change the entry to be:
{
    "cssurl":"/Style%20Library/<strong>contoso</strong>.css"
}

How do I place the CSS somewhere else than the root style library?

  1. Place your CSS in a publicly accessible library
  2. Go to your Tenant Wide Extensions list (located at: https://[yourtenant].sharepoint.com/sites/Apps/Lists/TenantWideExtensions/AllItems.aspx)
  3. Select the InjectCssApplicationCustomizer from the list.
  4. Select Edit Item from the ribbon.
  5. In the edit form, change the value in Component Properties to use your new CSS name and hit Save. For example, if you created a new style library called InjectCss in the root site, you'd change the entry to be:
{
    "cssurl":"/<strong>InjectCSS</strong>/custom.css"
}

How do I place the CSS in a CDN?

  • I didn't test it, but in theory you could follow the instructions above and change the cssurl value to include the full path to your CDN.

How do I do [insert your own customization] by injecting CSS?

I'm not a CSS expert, but here's how I usually do my customizations:

  1. Using your browser, surf to a modern page.
  2. Launch your browser's developer toolbar (CTRL-Shift-I for Chrome, F12 for Edge)
  3. Use the element selector (CTRL-Shift-C for Chrome, Ctrl-B for Edge) to select the element you want to customize.
  4. From the Styles pane in the developer tools, select + (New Style Rule) and enter the styles you want to change. Both Chrome and Edge have autocomplete capabilities, so feel free to explore. Don't worry, it only changes your current page, and does not get saved if you refresh the page or load a new page.
  5. If you find that your styles are getting overwritten as soon as you apply them, try adding an !important instruction at the end of your style. (CSS experts are cringing as they read this).
  6. Once your element looks the way you want it, copy the rule to your custom CSS and upload the CSS wherever you placed it in your tenant.

Did I forget anything?

If there is anything I forgot, please let me know in the comments. I'll try to answer every question... eventually.

Introduction

A week ago, Microsoft officially released the SharePoint Framework Package v1.5, introducing new awesome features like the Developer Preview of Dynamic Data and the ability to create solutions with beta features by adding --plusbeta to the Yeoman command -- among other features.

While it isn't necessary to update your existing SPFx solutions, you may need to do so (because, say, an existing solution needs a feature that is only available in SPFx 1.5).

Unfortunately, the solution upgrade process between versions of SPFx is often painful.

Thankfully, there is an easy way to do this now!

This article explains a (mostly) pain-free way to upgrade your SPFx solution. Waldek explains this process in detail, but this is a summary of how to do it.

Office 365 CLI

 

Office 365 CLI is a cross-platform command-line interface (at least, that's what I think CLI means... I hate acronyms) that allows you to do a lot of things with your Office 365 subscription, on pretty-much any operating system you want. (Find out more about Office 365 CLI.)

Office 365 CLI version 1.4.0-beta introduced a new spfx project upgrade command, which can be used to upgrade an SPFx project.

If you don't have Office 365 CLI version 1.4.0-beta or above, you'll need to install it first.  To do so, run the following command:

npm i -g @pnp/office365-cli@next

Analyzing your project

The spfx project upgrade command does not change your project -- you'll need to make the changes yourself. Instead, it analyzes your project and gives you a report telling you exactly what you need to do.

Sample upgrade report

To use it, follow these steps:

  • From your command-line, change your current directory to the root of your SPFx project.
  • Type the following command:
o365 spfx project upgrade --output md > report.md

Once analysis is completed, open the report.md file that was created in your SPFx project folder.

Upgrading your project

If you really want to understand all the required changes that the analysis found, you can read the whole report; but if you're in a hurry, follow these steps:

  • Back-up your project (do I really need to say this?)
  • Scroll to the (almost) end of the report.md file and look for the Summary section.
  • Copy the code block under the Execute script header and paste it into your console.

    The Summary Execute script section
  • Next, find every file in the Modify files section and make the highlighted changes. Pro tip: the report provides a hyperlink to each file that you need to change. Just use CTRL-Click to open the file.

    The Modify files section

Note that you may have multiple updates to make to the same file, but the report will list each update as a separate entry. The report also pretends that there is nothing else in the file than what it shows. So, for example, if your .yo-rc.json file looks like this before the upgrade:

{
    "@microsoft/generator-sharepoint": {
        "version": "1.4.1",
        "libraryName": "react-calendar-feed",
        "libraryId": "dd42aa00-b07d-48a2-8896-cc2f8c0d3fae",
        "environment": "spo"
     }
}

and the upgrade report tells you to update .yo-rc.json as follows:

{
    "@microsoft/generator-sharepoint": {
        "version": "1.5.0"
    }
}

You're really supposed to update the .yo-rc.json as follows (note the updated version number):

{
    "@microsoft/generator-sharepoint": {
        "version": "1.5.0",
        "libraryName": "react-calendar-feed",
        "libraryId": "dd42aa00-b07d-48a2-8896-cc2f8c0d3fae",
        "environment": "spo"
     }
}

But the next sections in the report will include more changes to the .yo-rc.json file, which -- when you've made all the changes -- will look like this:

{
    "@microsoft/generator-sharepoint": {
        "version": "1.5.0",
        "libraryName": "react-calendar-feed",
        "libraryId": "dd42aa00-b07d-48a2-8896-cc2f8c0d3fae",
        "environment": "spo",
        "isCreatingSolution": true,
        "packageManager": "npm",
        "componentType": "webpart"
     }
}

Once you've made all your changes, test your solution and (hopefully) it will work with SPFx 1.5!

Conclusion

You shouldn't need to upgrade your solution every single time Microsoft releases a new version of SPFx.

If you have to upgrade your solution, however, the Office 365 CLI spfx project upgrade command can save you a lot of time.

For more information

This article is mostly a note to myself on how to upgrade an SPFx project. For the real deal, I encourage you to read Waldek's detailed article, from where I learned about the spfx project upgrade command. (Thanks Waldek for being awesome!)

To learn more about Office 365 CLI, go to https://aka.ms/o365cli

To learn more about the cool new features available in SPFx 1.5, go to the Release Notes for SharePoint Framework Package 1.5.

The solution I used in this article is my React Calendar Feed sample web part, available on the SharePoint Framework Client-Side Web Part Samples & Tutorial Materials.

Introduction

In Part 1 of this article, I walked through the various components that we'll need to build to create a responsive calendar feed web part that mimics the out-of-the-box SharePoint events web part.

In this article, we'll:

  • Create a web part solution
  • Add a mock service to return test events, and
  • Display a simple list of events

The final product will look like this:

CalendarFeedPart1

Creating a web part solution

If you haven't done so yet, set up your SharePoint Framework development environment following Microsoft's awesome instructions.

We'll create a solution called react-calendar-feed-1. In future articles, we'll take what we built in this article as the foundation for react-calendar-feed-2, and so on until we're done with the solution, which we'll call react-calendar-feed. Of course, you can skip all the steps and get the code for the final solution, if you'd like.

When you're ready to create the solution, use the following steps:

  • Using the command line, create a new project directory
md react-calendar-feed-1
  • Change the current directory to your new project directory
cd react-calendar-feed-1
  • Launch the Yeoman SharePoint Generator:
yo @microsoft/sharepoint
  • When prompted for the solution name, accept the default react-calendar-feed-1.
  • For the baseline package select SharePoint Online only (latest).
  • When asked Where do you want to place the files? accept the default Use the current folder.
  • When asked if you want to allow the tenant admin the choice of being able to deploy the solution to all sites immediately respond No.
  • When asked for the type of client-side component to create select WebPart.
  • For Web part name, use CalendarFeedSummary. Later, we're planning on adding other web parts for searching events (but that's another blog post).
  • For Web part description, enter Displays events from an external feed.
  • When asked for a framework, select React.
  • When Yeoman is done creating the project for you, it'll say Congratulations! Solution react-calendar-feed-1 is created. Run gulp serve to play with it!.
  • Since we're not quite ready to play with the web part yet, let's launch Visual Studio Code by typing:
    code .
  • Once Visual Studio Code is launched, we're ready to code!

Cleaning up the generated web part code

If you open the CalendarFeedSummaryWebPart.ts file, you'll notice that there are multiple exports: one for ICalendarFeedSummaryWebPartProps and one for CalendarFeedSummaryWebPart.

One practice that I've learned by reading the Office UI Fabric code is that they keep the Prop and State interfaces in files separate from the component code, making each component file simpler and easier to read. This is a practice I tend to follow as well, so let's create a separate file for the web part's types:

  • In the src | webparts | calendarFeedSummary folder, create a new file called CalendarFeedSummaryWebPart.types.ts.
  • Back in CalendarFeedSummaryWebPart.ts, find the export interface ICalendarFeedSummaryWebPartProps block and cut it.
  • Go back to CalendarFeedSummaryWebPart.types.ts and paste the code you just cut (see the sketch after this list).
  • Back in CalendarFeedSummaryWebPart.ts, you'll want to add an import to the interface we just moved out. At the top of the file, just below the last import line, type the following:
    import { ICalendarFeedSummaryWebPartProps } from './CalendarFeedSummaryWebPart.types';
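Here is what the new file should contain after the move -- a sketch, assuming the default Yeoman-generated web part properties:

// CalendarFeedSummaryWebPart.types.ts
export interface ICalendarFeedSummaryWebPartProps {
  description: string;
}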

Creating shared services

Create the folder structure

When the final solution is completed, our web part will consume calendar feeds from various services. Those services will also be re-usable by other web parts.

We'll start by creating a single mock service that will return events in the format that we need. In future posts, we'll add more types of services.

  • In the src folder, create a new folder called shared. This is where all shared components will reside.
  • In the newly created shared folder, create a new folder called services. This is where all services will go. Even if we'll only have one type of service, it is a good idea to adopt a consistent folder structure.
  • In the services folder, create a folder called CalendarService.

Create an ICalendarEvent interface

Our calendar service providers will return a bunch of events that will all have the same properties:

  • Title: the title of the event
  • Start: the start date and time of the event
  • End: the end date and time
  • URL: the URL for the event, if applicable
  • AllDay: a boolean (true or false) value indicating if the event is an all-day event (i.e.: with no start and end time).
  • Category: a classification for the event, if applicable.
  • Description: a short text summary of the event, if available.
  • Location: a physical location for the event, if applicable.

Why "if applicable"? Not all event providers are capable of returning all properties for events.

To make it easier to work with, we'll create an ICalendarEvent interface that will expose all the above properties. Why an interface and not a class? Well, in TypeScript, an interface is the easiest way to describe a type of thing without actually saying what the thing does or how it does things.

If our events needed to do things, like calculate their own duration (end date minus start date) or something of the sort, we'd need a class to implement the method; our ICalendarEvent interface is really a convenient way to describe that all events have a title, a start date, end date, etc.

To create the ICalendarEvent interface:

  • In the src | shared | services | CalendarService folder, create a new file called ICalendarEvent.ts
  • Copy the code below and paste it in the new file you created:
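A sketch of the interface, reconstructed from the property list above (property names are illustrative; the "if applicable" properties are marked optional with ?):

export interface ICalendarEvent {
  title: string;
  start: Date;
  end: Date;
  url?: string;
  allDay: boolean;
  category?: string;
  description?: string;
  location?: string;
}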

Some may argue that ICalendarEvent is really a model and should reside in a different folder where all models go, but I like the simplicity of the CalendarService folder holding everything it needs to deliver a calendar feed. If I ever wanted to move it out to its own module, I could do it very simply.

Create a service interface

We'll first create an interface that all calendar service providers will implement. Again, the interface will describe what the calendar service providers will look like. Later, we'll create a calendar service provider class that will implement the interface.

But for now, let's create the interface:

  • In the src | shared | services | CalendarService folder, create a new file called ICalendarService.ts.
  • Create another file called index.ts in the CalendarService folder.
  • Paste the following code in each respective file (see the sketch after this list).
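A sketch of both files, based on the descriptions below:

// ICalendarService.ts
import { ICalendarEvent } from './ICalendarEvent';

export interface ICalendarService {
  getEvents: () => Promise<ICalendarEvent[]>;
}

// index.ts
export * from './ICalendarEvent';
export * from './ICalendarService';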

As you'll see, the ICalendarService interface says that all calendar service providers will need to implement a getEvents method that will return a promise of an array of ICalendarEvent. We return a promise because we'll (usually) be retrieving events from calendar service providers asynchronously, and promises make it easier to do that.

Don't worry, we'll explain this better when we implement our first real calendar service provider.

You'll notice that we created an index.ts in the root of the CalendarService folder and exported both the ICalendarService and the ICalendarEvent interfaces. Why? Just like index.html used to be the default web page for a site, index.ts is the default file for a folder in TypeScript. If you don't specify a file when using an import, it automatically looks for the default file.

But why would I create an index.ts file? Isn't it just an extra file that I'll need to maintain? Yes, but it makes it easier to hide the complexities of the CalendarService from the rest of the application -- consumers just need to know that they can get an ICalendarService and an ICalendarEvent interface from the CalendarService folder, without needing to know where (in which specific file) the interfaces are implemented. When we start adding new service providers, or when we move stuff around, we won't have to change our imports because we'll always point to the default index.ts for the CalendarService.
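For example, once the index.ts is in place, a consumer can import both interfaces from the folder itself:

import { ICalendarService, ICalendarEvent } from '../../../shared/services/CalendarService';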

Don't worry, it'll make sense very soon.

Creating the mock service provider

Now that we have an ICalendarEvent interface to represent events, and an ICalendarService to represent a service provider, let's combine the two and return some sample events.

Instead of creating events with hard-coded dates that will become obsolete as time goes by, we'll create events with dates that are dynamically generated when the web part is displayed. To make our lives easier, we'll use Moment.js to manipulate dates throughout this project. Moment.js makes it easy to manipulate dates and format them into human-readable formats.

  • From Visual Studio Code's Integrated Terminal (CTRL-`) type
    npm install moment
  • In the src | shared | services folder, create a new folder called MockCalendarService.
  • In the new folder, create a new file called MockCalendarService.ts, then create another file called index.ts.
  • Copy and paste the content from the files below into the respective files (see the sketch after this list).
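The full files are in the sample repository; here is a condensed sketch of what they look like. The event-generation details (titles, intervals, the one-second delay) are illustrative, matching the behavior described below:

// MockCalendarService.ts
import * as moment from 'moment';
import { ICalendarEvent, ICalendarService } from '../CalendarService';

export class MockCalendarService implements ICalendarService {
  public getEvents(): Promise<ICalendarEvent[]> {
    // pre-generate sample events ranging from tomorrow to roughly 18 months out
    const events: ICalendarEvent[] = [];
    for (let i: number = 0; i < 10; i++) {
      const start: Date = moment().add(1 + i * 54, 'days').toDate();
      const durationDays: number = i % 3; // some events last a few days

      events.push({
        title: `Sample event ${i + 1}`,
        start: start,
        end: moment(start).add(durationDays, 'days').toDate(),
        allDay: durationDays > 0,
        category: 'Mock',
        description: 'A pre-generated test event',
        location: 'Tri-state area'
      });
    }

    // simulate the delay of retrieving the events through an HTTP request
    return new Promise<ICalendarEvent[]>((resolve) => {
      setTimeout(() => resolve(events), 1000);
    });
  }
}

// index.ts
export * from './MockCalendarService';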

The MockCalendarService creates a series of events that range from tomorrow to 18 months from now. Some are only 1 day long, but some events last a few days.

The getEvents method in MockCalendarService simulates the delay of getting the events through an HTTP request and returns the list of pre-generated events.  In a later article, we'll actually get real events, but -- for now -- this should do to test our rendering.

Rendering events

Although our goal is to render calendar events that look exactly like what SharePoint does, we'll begin by rendering a list of events as bullet points. This is to ensure that our code works, and to allow us to finish this article with something that works before we explore rendering.

  • Find the CalendarFeedSummary.tsx file (located under src | webparts | components | calendarFeedSummary)
  • Above the render function, add a new public function called componentDidMount which calls this._loadEvents() (we'll create the _loadEvents function shortly). The code should look as follows:
public componentDidMount(): void {
    this._loadEvents();
 }
  • Below the render function (I like to keep my public functions separate from my private functions), add a private function called _loadEvents(). The code will look as follows:
private _loadEvents(): void {
    const dataProvider: ICalendarService = new MockCalendarService();
    if (dataProvider) {
      this.setState(
        {
          isLoading: true
        });
      dataProvider.getEvents()
        .then((events: ICalendarEvent[]) => {
          this.setState({
            isLoading: false,
            events: events
          });
        }).catch((error: any) => {
          console.log("Exception returned by getEvents", error.message);
          this.setState({
            isLoading: false,
            events: []
          });
        });
    }
  }
  • You'll notice that we're referring to isLoading and events state variables, but we haven't defined them. Let's fix that by going to CalendarFeedSummaryProps.ts and adding a new interface called ICalendarFeedSummaryState, as follows:
export interface ICalendarFeedSummaryState {
    isLoading: boolean;
    events: ICalendarEvent[];
}
  • And, at the top of the same file, add a reference to ICalendarEvent as follows:
import { ICalendarEvent } from "../../../shared/services/CalendarService";
  • Since the file no longer contains only the CalendarFeedSummaryProps, rename the file from CalendarFeedSummaryProps.ts to CalendarFeedSummary.types.ts.
  • Back in CalendarFeedSummary.tsx, find the following line:
export default class CalendarFeedSummary extends React.Component<ICalendarFeedSummaryProps, {}> {
  • And replace it with:
export default class CalendarFeedSummary extends React.Component<ICalendarFeedSummaryProps, ICalendarFeedSummaryState> {
  • This essentially tells the CalendarFeedSummary component to use the ICalendarFeedSummaryProps interface for its properties, and the ICalendarFeedSummaryState interface for its state.
  • Make sure to update the existing reference to ICalendarFeedSummaryProps and to include a reference to ICalendarFeedSummaryState by changing the following import statement at the top of the file:
import { ICalendarFeedSummaryProps } from './ICalendarFeedSummaryProps';

with:

import { ICalendarFeedSummaryProps, ICalendarFeedSummaryState } from './CalendarFeedSummary.types';
  • Since we no longer use an empty state, we need to initialize it with a constructor. At the top of the CalendarFeedSummary.tsx file, just above the componentDidMount function, add the following code:
constructor(props: ICalendarFeedSummaryProps) {
    super(props);
    this.state = {
        isLoading: false,
        events: [],
    };
}
  • In the render method, remove the div with the className styles.container and all of its content, leaving an empty div. You'll be left with something that looks like this:
public render(): React.ReactElement<ICalendarFeedSummaryProps> {
    return (
      <div></div>
    );
}

  • Inside the blank div in the render function, add some code that will render the events as a bulleted list, by adding the following code:
<ul>
  { this.state.events.map(e => { return <li>{e.title}</li>; })}
</ul>

The final code should look as follows:
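A sketch of the resulting render method, assuming only the changes made in this article (the complete file is in the sample repository):

public render(): React.ReactElement<ICalendarFeedSummaryProps> {
    return (
      <div>
        <ul>
          { this.state.events.map(e => { return <li>{e.title}</li>; })}
        </ul>
      </div>
    );
}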

 

When you're ready to test the web part, type:

gulp serve

and add the web part we created to the page. The events will render as a list:

CalendarFeedPart1

Conclusion

Although it isn't very exciting (yet), the web part we created generates a bunch of events, simulates retrieving them from an HTTP request, and renders them in a list.

In our next article, we'll render the events so that they look like SharePoint events.

Introduction

Last week, I attended the SharePoint 2018 Conference in Las Vegas. There were a lot of cool announcements and demos. The SharePoint team rocks!

One of the cool things that I noticed which has nothing to do with SharePoint was that a lot of presenters who showed code had a really cool command prompt that showed the node module they were in, and their Git branch status in a pretty "boat chart".

Console showing node module version and git branching information

I had seen this many times before, but never realized how much easier it was to get a sense of what's going on until I was watching someone else code on a big screen.

Of course, I set out to find and configure this awesome command-line on my workstation.

This article will show you how you too can install and configure this command line interface.

Cmder

During Vesa's awesome session, I paid close attention to the title of his command line window. It said Cmder.

I had seen Cmder before; the article Set up your SPFx development environment mentions Cmder in the Optional Tools section.

But the version of Cmder I had installed didn't have the fancy "boat chart" at the top that got my attention.

As it turns out, you need to download another custom prompt for Cmder that adds the Powerline (that's the real name for the "boat chart") at the top.

Here is how to install and configure Cmder with the Powerline command prompt:

Installing Cmder

  1. Go to http://cmder.net/ and download either the Mini pack or the Full pack.
  2. Unzip the package. Cmder is designed to be portable and to require no administrative privileges to run, so their instructions tell you to not install it in the Program Files folder (where you'll need administrative privileges). I placed it in C:\Users\[myusername]\AppData\Local\cmder.
  3. Open a command prompt in Administrative mode from the folder where you copied the Cmder files
  4. From the command-prompt, type:
    cmder /REGISTER ALL
  5. If you get an Access Denied error, you probably forgot to run the command in Administrative mode. If you don't know how to do that, type cmd from your Start menu, and right-click on Command Prompt and select Run as administrator.
  6. Cmder should be installed. You can verify by opening a new File Explorer window and right-clicking on a folder. You should get a Cmder Here option.
    Cmder Here

Unfortunately, if you open Cmder with that command line, you don't get the fancy Powerline.

Let's fix that!

Installing Cmder Powerline custom prompt

The Cmder Powerline custom prompt changes the Cmder prompt to include the following modifications:

  • The folder portion of the prompt is displayed in blue. The user's home folder is also replaced with a tilde (~).
  • If the current folder is an npm package, the prompt will display the package name and version number in teal.
  • If the current folder is a Git repository, the prompt will display the branch name with a green colour if the branch is unchanged, or yellow if changes are found.

To install the Cmder Powerline custom prompt:

  1. Download the AnonymousPro font. You can do so by clicking on each TTF file in GitHub and selecting View Raw. For your convenience, here are the links to the raw files:
    Anonymice Powerline Bold Italic.ttf
    Anonymice Powerline Bold.ttf
    Anonymice Powerline Italic.ttf
    Anonymice Powerline.ttf
  2. Once you have downloaded each font, install them by double-clicking them and selecting Install on each one.
  3. Copy all the .lua files from the Cmder Powerline source and place them in the config folder under the Cmder install folder.
  4. If you haven't done so yet, launch a Cmder window by going to the folder where you installed it and double-clicking on Cmder.exe.
  5. From the Cmder window, open the Settings by hitting Windows-Alt-P.
  6. From the Main settings area, select Anonymice Powerline font from the Alternative font (pseudographics, CJK, etc.) drop-down.
  7. In the Unicode ranges combo box, type E0A0-E0B0 and select Apply.
  8. Select Save settings to save your settings and return to the command prompt in Cmder.

CmderSettings

That's all you need to do.

Cmder with Visual Studio Code

If you want Cmder to show up in Visual Studio Code, follow these steps:

  1. Launch Visual Studio Code.
  2. From the File menu, select Preferences | Settings or use Ctrl-, (Control and comma). This will open your settings editor.
  3. In the right-pane of the settings editor (the one that's actually editable), insert the following JSON just before the last }, making sure to replace the path to Cmder with the path where you installed it.
    "terminal.external.windowsExec": "C:\\Users\\[myusername]\\AppData\\Local\\cmder\\Cmder.exe",
    "terminal.integrated.shell.windows": "cmd.exe",
    "terminal.integrated.shellArgs.windows" : [
    "/K",
    "C:\\Users\\[myusername]\\AppData\\Local\\cmder\\vendor\\init.bat"
    ],

That's all!

Conclusion

I hope that you'll find Cmder and the custom Cmder Powerline command-prompt useful in your SPFx development endeavors.

I know I did!

For More Information

Cmder.net lists more information about Cmder, including the super-powerful shortcut keys.

Amr Eldib is the brilliant mind behind the Cmder Powerline command-prompt.

Sahil Malik has detailed instructions (and a video!) to integrate Cmder with Visual Studio Code.

Update

In the previous revision of this article, I had forgotten to include the steps to copy the .lua files to the config folder. It works much better when you include all the steps, it turns out 🙂

 

Introduction

One of the premises of SPFx is that, with it, third-party developers have the same set of tools that the SharePoint team has. So, if you like the look of an out-of-the-box web part you can, in theory, reproduce the same look and feel yourself.

A friend of mine needed to display a list of upcoming events, but the events are coming from a WordPress site that uses the WP Fullcalendar widget. They also really liked the look of events in SharePoint.

So, I thought: why not try re-creating the out-of-the-box SharePoint events web part, but instead of reading events from a SharePoint list (or group calendar), it would read from WordPress?

Since I was taking the challenge, I decided to also try to do these extra features:

  • Read events from multiple event providers, including RSS, iCal, and WordPress.
  • Support additional event providers without having to re-design the entire web part
  • Make the web part responsive, just like the SharePoint events web part, with a narrow view and a wide view.
  • Support "Add to my calendar"
  • Make it possible to add more web parts, for example, the Event Search web part, reusing as many of the components as possible.

This article will explain the various components of this web part. Because I tend to ramble on and on, I'll then explain how to write every component of the web part in separate articles so that you can read as much (or as little) as you want.

And if you really don't want to read the articles, you can always get the code. I won't be offended if you do.

The Web Part

Configuration

If you download the web part and run

gulp serve

you'll see the web part in your web part catalog.

Adding the web part

Note: when I designed this web part, I created an SVG icon for it. At the time of this writing, there was an issue with using custom base64-encoded SVG icons. If your icon doesn't look like the one in the picture above, don't worry.

When you add the web part, you'll be prompted to configure it:

Configure event feed

Selecting the Configure button (or selecting Edit web part in the web part's "toolbox") will launch the web part's property pane.

The web part's property pane

In the property pane, the Feed type drop-down lists all the service providers that the web part can find.

feedtype

The idea is that if we add more feed types, they'll automatically show up here. Let me know in the comments if you have an idea for a feed type you think we should add, or if you'd like to add one yourself just submit a pull request.

If you're running the web part in a development environment, it'll offer you a Mock option, which will add bogus events for testing purposes. In production, this option will not appear.

The Feed URL input box will prompt you to enter a URL for the feed you wish to display. It validates the URL format (but doesn't yet check the URL for results).

FeedUrl

Because the WordPress feed URL that I was using supports a from and to date value in the URL, I added the ability to automatically insert today's date and an end date (see below). All you have to do is to add a {s} where you want the start date and {e} where you want the end date.
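Under the hood, that kind of token substitution can be a simple string replace. Here's a rough sketch of the idea (the helper name is hypothetical, not the web part's actual code):

// Hypothetical helper: swaps the {s} and {e} tokens in the feed URL
// template for the start and end dates of the selected date range.
function buildFeedUrl(template: string, start: Date, end: Date): string {
  return template
    .replace("{s}", start.toISOString())
    .replace("{e}", end.toISOString());
}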

The Date range drop-down allows you to select anything from Next week to Next year.

DateRange

Unlike the out-of-the-box SharePoint events search, I didn't add an All events option because there was no way (that I know of) in React to find the maximum possible date. I could have passed a null value around, but I didn't want to do that. If there are enough requests for it, I'll figure out a way to do All events later.

The only event provider that I know of which actually supports specifying a start and end date is WordPress. When a provider doesn't support filtering at the source, I just filter them out once I have received the events.

In the Advanced section, you can specify the Maximum number of events per page for the narrow view (the normal view just fits in as many events as it can on every page).

MaxPageSize

The default is 4 (that's what SharePoint events does), but you can put as many as you want on every page. You can also put 0 if you don't want pagination for the narrow view.

When I was testing this web part, I kept on getting all sorts of CORS issues on some of the feeds I was using. So I added a Use proxy option, which -- you guessed it -- routes your requests through a proxy.

UseProxy

Finally, the web part can use the user's local storage to cache events it retrieves so that the web part doesn't fetch every. single. time. you. resize. the. page.

CacheDuration

You can set the cache duration from 0 to 1440 minutes (1 day) in 15 minute increments. Be careful, though, because it'll always cache a user's results from the time they last retrieved the events. So, if you set it to cache for a day, it'll wait an entire day before reloading events again no matter the time of the day. You should probably set it to half-a-day, just to be safe.

If you don't want to cache, you can set the cache duration to 0 and it'll refresh from the source every time. If your feed is slow, the web part will take forever to load every time.
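Conceptually, the cache stores the retrieved events alongside a timestamp and only reuses them while they're fresh. A simplified sketch of the idea (names and shapes are illustrative, not the web part's actual implementation):

// Illustrative cache read: returns the cached events while they are still
// fresh, or undefined when caching is disabled, empty, or expired.
function getCachedEvents(cacheDurationMinutes: number): any[] | undefined {
  if (cacheDurationMinutes === 0) {
    return undefined; // caching disabled: always load from the feed
  }
  const cached: string | null = localStorage.getItem("calendarFeedCache");
  if (!cached) {
    return undefined;
  }
  const { retrieved, events } = JSON.parse(cached);
  const ageInMinutes: number = (Date.now() - retrieved) / 60000;
  return ageInMinutes < cacheDurationMinutes ? events : undefined;
}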

The Apply button is just to make sure that the web part won't try to load the feed as you type the URL.

Assuming you configured the web part (and that my code works well), you'll get to see your events in a pretty calendar view soon enough.

The narrow view

When you put the web part in a single-column, or when the web part is less than 480 pixels wide, the web part renders a list view of events.

NarrowView.png

The list will render all the events retrieved and paginate the results according to the page size option you configured.

The dates are rendered to look like a page-a-day calendar.

DateBox

If the event spans over multiple days, the date box will render differently:

MultiDayDateBox

The pagination component renders a Previous and Next button, and helps manage how many pages to render, which page to render, etc. Unfortunately, Office UI Fabric doesn't offer a pagination control so I had to write my own.

Of course, if I wasn't so lazy, I would have created a full pagination control with page numbers, and all, but the SharePoint events web part doesn't show the page numbers so I didn't do it. If there is enough demand for it, I'll make the component more generic and add the page numbers.

The Normal view (or carousel view)

When you view the web part on a full page (or when it is wider than 480 pixels), the web part switches to a carousel view.

Carousel View

The carousel view is responsive and renders between 1 and 4 events per page.

Like the SharePoint events web part, there is a next and previous arrow when you mouse over the calendar, with dots at the bottom to indicate what page you're on.

CarouselNav

Finally, the Add to my calendar button creates a dynamic ICS file, allowing you to import the event to most calendars on most devices.
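An ICS file is just plain text assembled on the fly, so no server-side component is needed. A minimal sketch of the idea (simplified; the real component handles more fields):

// Builds a minimal ICS payload for a single event.
function buildIcs(title: string, start: Date, end: Date): string {
  // ICS dates look like 20180521T130000Z
  const toIcsDate = (d: Date): string =>
    d.toISOString().replace(/[-:]/g, "").split(".")[0] + "Z";
  return [
    "BEGIN:VCALENDAR",
    "VERSION:2.0",
    "BEGIN:VEVENT",
    "DTSTART:" + toIcsDate(start),
    "DTEND:" + toIcsDate(end),
    "SUMMARY:" + title,
    "END:VEVENT",
    "END:VCALENDAR"
  ].join("\r\n");
}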

Conclusion

In upcoming articles, I'll show how to build this, component by component.

I hope that you'll enjoy it.

Why would you want to inject CSS?

Since Microsoft introduced Modern Pages to Office 365 and SharePoint, it is really easy to create beautiful sites and pages without requiring any design experience.

If you need to customize the look and feel of modern pages, you can use custom tenant branding, custom site designs, and modern site themes without incurring the wrath of the SharePoint gods.

If you want to go even further, you can use SharePoint Framework Extensions and page placeholders to customize well-known areas of modern pages. Right now, those well-known locations are limited to the top and bottom of the page, but I suspect that in a few weeks, we'll find out that there are more placeholder locations coming.

But what happens when your company has a very strict branding guideline that requires very specific changes to every page? When your customization needs go beyond what's supported in themes? When you need to tweak outside of those well-known locations?

Or, what if you're building a student portal on Office 365 and you need to inject a custom font in a page that is specifically designed to help users with dyslexia?

That's when I would use a custom CSS.

Here be dragons!

Before you go nuts and start customizing SharePoint pages with crazy CSS customizations, we need to set one thing straight:

With SharePoint, you should always colour within the lines. Don't do anything that isn't supported, ever. If you do, and you run into issues, you're on your own.

A badly coloured version of the SharePoint logo.
With SharePoint, you should always colour within the lines

Remember that Microsoft is constantly adding new features to SharePoint. The customizations you make with injecting custom CSS may stop working if the structure of pages change.

What's worse, you could make changes to a page that prevents new features from appearing on your tenant because you're inadvertently hiding elements that are needed for new features.

With custom CSS (and a CSS zen master), you can pretty much do anything you want. The question you should ask yourself is not whether you can do it, but whether it is the right thing to do.

Enough warnings! How do I inject custom CSS?

It is very easy. In fact, I'm probably spending more time explaining how to do it than it took me to write the code for this. If you don't care about how it works, feel free to download the source and install it.

Using SharePoint Framework Extensions, you can write code that you can attach to any Site, Web, or List. You can control the scope by how you register your extensions in your SharePoint tenant.

With an extension, you can insert tags in the HTML Head element.

I know what you're thinking: we could just insert a STYLE block in the HEAD element with our own CSS. Sure, but what happens when you need to change your CSS? Re-build and re-deploy your extension? Nah!

Instead, how about inserting a LINK tag and point to a custom CSS that's located in a shared location? That way, you can modify the custom CSS in one place.

You can even have more than one custom CSS and use your extension properties to specify the URL to your custom CSS. In fact, you can add more than one extension on a site to combine multiple custom CSS together to suit your needs.

Building your custom CSS injection extension

You too can design a beautiful SharePoint site that looks like this:

sampleresults
I'm really a better designer than this. I just wanted a screen shot that smacks you in the face with a bright red bar and a custom round site icon. It hurts my eyes.
  1. Start by creating your own custom CSS (something better than I did, please). For example, the above look was achieved with the following CSS:
    .ms-compositeHeader {
        background-color: red;
    }
    .ms-siteLogoContainerOuter {
        border-radius: 50%;
        border-width: 3px;
    }
    .ms-siteLogo-actual {
        border-radius: 50%;
    }
  2. Save your custom CSS to a shared location on your SharePoint tenant. For example, you could save it in the Styles Library of your root site collection. You could also add it to your own Office 365 CDN. Make note of the URL to your CSS for later. For example, if you saved your custom CSS as contoso.css in the Styles Library of your tenant contoso.sharepoint.com, your CSS URL will be:
https://contoso.sharepoint.com/Style%20Library/contoso.css

which can be simplified to:

/Style%20Library/contoso.css
  3. Create an SPFx extension following the instructions provided in the Build your first SharePoint Framework Extension (Hello World part 1) article. (Hey, why improve what's already perfect?)
  4. Change the props interface that was created for your ApplicationCustomizer class and replace the description property with a cssurl property. For example, my ApplicationCustomizer class is called InjectCssApplicationCustomizer, so my props interface is going to be called IInjectCssApplicationCustomizerProperties. Like this:
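    A minimal sketch of that interface:
    export interface IInjectCssApplicationCustomizerProperties {
      cssurl: string;
    }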
  5. Change your onInit method to insert a LINK element pointing to your cssurl property.
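    A sketch of what that onInit might look like (simplified, with no error handling):
    public onInit(): Promise<void> {
      const cssUrl: string = this.properties.cssurl;
      if (cssUrl) {
        // inject a LINK tag pointing to the custom CSS into the page HEAD
        const head: HTMLElement = document.getElementsByTagName("head")[0];
        const customStyle: HTMLLinkElement = document.createElement("link");
        customStyle.href = cssUrl;
        customStyle.rel = "stylesheet";
        customStyle.type = "text/css";
        head.insertAdjacentElement("beforeend", customStyle);
      }
      return Promise.resolve();
    }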
  6. In your serve.json located in the config folder, change the pageUrl to connect to a page on your tenant. Also change the cssurl property to pass the URL to the custom CSS you created in steps 1-2, as follows:
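    For example, a sketch of the relevant serve configuration (the component id is a placeholder; use the one from your manifest):
    "serveConfigurations": {
      "default": {
        "pageUrl": "https://yourtenant.sharepoint.com/SitePages/Test-extension.aspx",
        "customActions": {
          "00000000-0000-0000-0000-000000000000": {
            "location": "ClientSideExtension.ApplicationCustomizer",
            "properties": {
              "cssurl": "/Style%20Library/contoso.css"
            }
          }
        }
      }
    }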
  7. Test that your extension works by running gulp serve. When prompted to allow debug scripts, select Load debug scripts.

DebugScriptWarning

You can now tweak your custom CSS to suit your needs, continuing to hit refresh until you're happy with the results.

Deploying to your production tenant

When ready to deploy, you need to bundle your solution, upload it to the app catalog, and enable the extension on every site you want to customize.

To make things easy, you can add an elements.xml file in your SharePoint folder and pre-configure your custom CSS URL. Here's how:

  1. In your solution's sharepoint/assets folder, create a new file called elements.xml. If you don't have a sharepoint folder or assets sub-folder, create them.
  2. Paste the code below in your elements.xml:
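    Something along these lines (a sketch; substitute your own id and CSS URL as described in the next steps):
    <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
        <CustomAction
            Title="InjectCss"
            Location="ClientSideExtension.ApplicationCustomizer"
            ClientSideComponentId="00000000-0000-0000-0000-000000000000"
            ClientSideComponentProperties="{&quot;cssurl&quot;:&quot;/Style%20Library/contoso.css&quot;}">
        </CustomAction>
    </Elements>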
  3. Make sure to replace the custom action Title and ClientSideComponentId to match your own extension. You can find those values in your InjectCssApplicationCustomizer.manifest.json, under id and alias.
  4. Change the ClientSideComponentProperties to point to your CSS URL. Pay attention to URL encode the values (e.g.: a space becomes %20).
  5. Run gulp bundle --ship to bundle your solution.
  6. Run gulp package-solution --ship
  7. Drag and drop the .sppkg file that was created in your sharepoint/solution folder to your tenant's app catalog.

If you selected to automatically deploy to all site collections when building the extension, you're done. If not, you'll need to go to every site and add the extension by using the Site Contents and Add an App links.

Conclusion

You can easily inject custom CSS in every modern page of your SharePoint tenant by using an SPFx extension, but be careful. With great CSS power comes great SharePoint responsibility.

You can get the code for this extension at https://github.com/hugoabernier/react-application-injectcss

I'd love to see what you're doing with your custom CSS. Let me know in the comments what you have done, and -- if you're interested -- share the CSS.

I hope this helps!

In part 1 of this article, I introduced the concept for an SPFx extension that adds a header to every page, showing the classification information for a site.

In part 2, we created an SPFx extension that adds a header that displays a static message with the security classification of a site.

In part 3, we learned more about property bags and learned a few ways to set the sc_BusinessImpact property (a property we made up) of our test sites to LBI, MBI, and HBI.

In part 4, we wrote the extension that reads from a site's property bags and displays the classification in the header.

In this part, we will clean up a few things, package and deploy the extension.

Preparing to deploy to production

The extension we wrote in parts 1-4 of this article works, but it isn't really production ready.

First, we'll want to change the code to only display the extension if a web can find a site's information security classification in its property bag. That way, if you chose to deploy the extension to production, you won't have to worry about affecting sites that do not have a security classification (although, it is recommended that every site has a classification, even if it is LBI by default).

Second, we'll change the hard-coded hyperlink to point to a page on your tenant that provides handling instructions for each security classification.

Then we'll remove all those hard-coded strings and replace them with localized strings.

Let's get started!

Conditionally display the extension

So far, our code assumes that every site has a security classification -- which is the right thing to do if you want to be compliant.

However, there are cases where you may want to deploy this extension in production and not display a security classification until you've actually applied a classification to a site.

To do this, we'll change our code a little bit.

  1. In ClassificationHeader.types.ts, we'll change the default classification to be undefined. So, we're changing this line:
    export const DefaultClassification: string = "LBI";
    

    to this line:

    export const DefaultClassification: string = undefined;
    
  2. Now let's change the render method in ClassificationHeader.tsx to handle an undefined value and skip rendering if there is no security classification. Change this code:
    var barType: MessageBarType;
        switch (businessImpact) {
          case "MBI":
            barType = MessageBarType.warning;
            break;
          case "HBI":
            barType = MessageBarType.severeWarning;
            break;
          default:
            barType = MessageBarType.info;
        }
    

    to this code:

        // change this switch statement to suit your security classification
        var barType: MessageBarType;
        switch (businessImpact) {
          case "MBI":
            barType = MessageBarType.warning;
            break;
          case "HBI":
            barType = MessageBarType.severeWarning;
            break;
          case "LBI":
            barType = MessageBarType.info;
            break;
          default:
            barType = undefined;
        }
    
        // if no security classification, do not display a header
        if (barType === undefined) {
          return null;
        }
    

When you're done, the code should look like this:

Test your extension again, making sure to try with an LBI, MBI, and HBI site, as well as any other site that hasn't been classified yet (i.e.: that doesn't have a security classification property bag value defined yet).

Linking to handling procedures

Since the first part of this article, I have been using a fake URL instead of an actual link to handling instructions. Let's set a default URL to display proper handling procedures.

  1. Start by creating a page on your SharePoint site that explains to your users how they should properly handle information based on their security classification. You can create one page, or (ideally) create a separate set of URLs for each classification.
  2. In ClassificationHeader.types.ts, we'll add a new constant to store the URL to the new handling procedures page you created. If you created more than one, feel free to add more than one constant. If you don't want to use a hyperlink, just set it as undefined. Add this line of code, with the URL of your choice:
    export const DefaultHandlingUrl: string = "/SitePages/Handling-instructions.aspx";
    

    Remember that your URLs should be absolute (e.g.: https://yourtenant.sharepoint.com/sitepages/handling-instructions.aspx) or at least relative to the root (e.g.: /sitepages/handling-instructions.aspx), because your links will get rendered on every page in the site.

  3. Now let's change the render method in ClassificationHeader.tsx to use the handling URL in the hyperlink. Change this code:
public render(): React.ReactElement<IClassificationHeaderProps> {
    // get the business impact from the state
    let { businessImpact } = this.state;

    // change this switch statement to suit your security classification
    var barType: MessageBarType;
    switch (businessImpact) {
      case "MBI":
        barType = MessageBarType.warning;
        break;
      case "HBI":
        barType = MessageBarType.severeWarning;
        break;
      case "LBI":
        barType = MessageBarType.info;
        break;
      default:
        barType = undefined;
    }

    // if no security classification, do not display a header
    if (barType === undefined) {
      return null;
    }

    return (
      <MessageBar messageBarType={barType}>
        This site is classified as {this.state.businessImpact}. <Link href="#">Learn more about the proper handling procedures.</Link>
      </MessageBar>
    );
  }

to this code (note that you'll need to add an import for DefaultHandlingUrl at the top, not shown here):

public render(): React.ReactElement<IClassificationHeaderProps> {
    // get the business impact from the state
    let { businessImpact } = this.state;

    // get the default handling URL
    let handlingUrl: string = DefaultHandlingUrl;

    // change this switch statement to suit your security classification
    var barType: MessageBarType;
    switch (businessImpact) {
      case "MBI":
        // if you'd like to display a different URL per classification, override the handlingUrl variable here
        // handlingUrl = "/SitePages/Handling-instructions-MBI.aspx"
        barType = MessageBarType.warning;
        break;
      case "HBI":
        barType = MessageBarType.severeWarning;
        break;
      case "LBI":
        barType = MessageBarType.info;
        break;
      default:
        barType = undefined;
    }

    // if no security classification, do not display a header
    if (barType === undefined) {
      return null;
    }

    return (
      <MessageBar messageBarType={barType}>
        This site is classified as {this.state.businessImpact}.
        {handlingUrl && handlingUrl !== undefined ?
          <Link href={handlingUrl}>Learn more about the proper handling procedures.</Link>
          : null
        }
      </MessageBar>
    );
  }

When you're done, the code should look like this:

Localizing resources

There are a few places in our code where we display some text that is hard-coded in the code.

Being of French-Canadian origins, I am especially sensitive to the aspect of localization; you shouldn't hard-code text, dates, numbers, currencies, and images in code if you can avoid it. Not only because it makes it easier to support another language, but also because it makes it easy to maintain the text in your solution without wading through code.

Flashback: I remember working on a project where the geniuses in the marketing department changed the name of the product about 17 times while we were building it. Every time, the team would have to scour through the code to change the references to the product name. Once they learned the wonders of localization and string resources, they could change all references to the product name in a few seconds (they still gave the marketing department a hard time, though) 🙂

You only need to localize the code where something that is displayed could potentially change in a different locale. It's not just about language: dates, numbers, and currencies are displayed differently depending on where you live, even if you speak English. You don't need to worry about debugging code (e.g.: when you write to the console) unless you want people who speak a different language to debug your code too.

Luckily, our code has only a few strings literals to worry about, and they're all in the ClassificationHeader.tsx.

You don't have to localize your code. But you should. So follow these instructions if you want to be a better SPFx developer:

  1. In the myStrings.d.ts file, located in the loc folder (source | extensions | classificationExtension | loc), add the following two lines to the
    IClassificationExtensionApplicationCustomizerStrings interface:
    ClassifactionMessage: string;
    HandlingMessage: string;
  2. In the en-us.js file, add two more lines below the "Title" line, making sure to add a comma at the end of the line that already exists:
    "ClassifactionMessage": "This site is classified as {0}. ",
    "HandlingMessage": "Learn more about the proper handling procedures."
  3. Now go to the ClassificationHeader.tsx file and add a reference to your localized strings at the top of the file, below all the other import statements:
    import * as strings from "ClassificationExtensionApplicationCustomizerStrings";
  4. Finally, replace the code in the render method to use the localized strings. Note that we're replacing the placeholder in the localization string with the classification label. We could have simply concatenated the values, but every language has a different syntax structure, and doing it this way makes it easier to deal with different language syntax.
    return (
      <MessageBar messageBarType={barType}>
        {strings.ClassifactionMessage.replace("{0}", this.state.businessImpact)}
        {handlingUrl && handlingUrl !== undefined ?
          <Link href={handlingUrl}>{strings.HandlingMessage}</Link>
          : null
        }
      </MessageBar>
    );

Your code should look like this:

Optional: using configuration properties

The eagle-eyed reader may have noticed two things:

  1. There is a testMessage property that is defined in the ClassificationExtensionApplicationCustomizer.ts that we never use.
  2. The ClassificationPropertyBag, DefaultClassification, and
    DefaultHandlingUrl are all hard-coded. If you ever need to change any of the configuration items, you'd have to change the code, re-build, and re-deploy.

Thankfully, the SPFx team did a great job and designed SPFx extensions to support configuration properties. I don't know if that's what they're actually called, but that's what I call them 🙂

The testMessage is a sample configuration property that is created for us when we use the Yeoman generator. We can replace this property with anything that suits us. In our case, the ClassificationPropertyBag, DefaultClassification, and DefaultHandlingUrl.

To do this, let's follow these steps:

  1. Open ClassificationExtensionApplicationCustomizer.ts and replace the IClassificationExtensionApplicationCustomizerProperties interface code so that it looks like this:
    export interface IClassificationExtensionApplicationCustomizerProperties {
      ClassificationPropertyBag: string;
      DefaultClassification: string;
      DefaultHandlingUrl: string;
    }
  2. In the ClassificationHeader.types.ts file, add the same properties to the IClassificationHeaderProps interface by replacing the code with this:
    export interface IClassificationHeaderProps {
        context: ExtensionContext;
        ClassificationPropertyBag: string;
        DefaultClassification: string;
        DefaultHandlingUrl: string;
    }
  3. While you're in there, make sure to remove the other definitions of ClassificationPropertyBag, DefaultClassification, and DefaultHandlingUrl.
  4. Now back in ClassificationExtensionApplicationCustomizer.ts pass the properties to the ClassificationHeader props by replacing this code:
    const elem: React.ReactElement<IClassificationHeaderProps> = React.createElement(ClassificationHeader, {
            context: this.context
          });

    to this:

    const elem: React.ReactElement<IClassificationHeaderProps> = React.createElement(ClassificationHeader, {
            context: this.context,
            ClassificationPropertyBag: this.properties.ClassificationPropertyBag,
            DefaultClassification: this.properties.DefaultClassification,
            DefaultHandlingUrl: this.properties.DefaultHandlingUrl
          });
    
  5. To prevent any issues from not having any configuration information, let's add some code at the top of the onInit method:
    if (!this.properties.ClassificationPropertyBag) {
          const e: Error = new Error("Missing required configuration parameters");
          Log.error(LOG_SOURCE, e);
          return Promise.reject(e);
        }
  6. Finally, find any references to ClassificationPropertyBag, DefaultClassification, or DefaultHandlingUrl in ClassificationHeader.tsx and replace them with this.props.[property]. For example, replace ClassificationPropertyBag with this.props.ClassificationPropertyBag.

When you're done, the code should look like this:

This will allow you to pass configuration properties to the extension without having to change code.

To test this:

  1. Find serve.json in the config folder.
  2. Replace the "properties" attribute to pass the configuration we need, from this:
    "properties": {
                "testMessage": "Test message"
              }
    

    to this:

    "properties": {
                "ClassificationPropertyBag": "sc_x005f_BusinessImpact",
                "DefaultClassification": "",
                "DefaultHandlingUrl":"/SitePages/Handling-instructions.aspx"
              }
  3. Launch the extension by using gulp serve and test that the extension still works.

Note: if you're planning on debugging the extension, don't forget that the URL has now changed with these new properties. Follow the instructions earlier to copy the URL to the launch.json file.

Deploying to production

Assuming that everything works, we're only a few steps away from deploying to production:

  1. When you deploy the solution that includes the extension, SharePoint looks for the default configuration in the elements.xml and uses whatever it finds. Since we changed the default properties, let's go change the elements.xml file (you can find it in the sharepoint folder) to the following:
    <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
        <CustomAction
            Title="ClassificationExtension"
            Location="ClientSideExtension.ApplicationCustomizer"
            ClientSideComponentId="4017f67b-80c7-4631-b0e5-57bd266bc5c1"
            ClientSideComponentProperties="{"ClassificationPropertyBag":"sc_x005f_BusinessImpact","DefaultClassification":"","DefaultHandlingUrl":"/SitePages/Handling-instructions.aspx"}">
        </CustomAction>
    </Elements>
    
  2. From the Terminal pane type:
    gulp bundle --ship
  3. Followed by:
    gulp package-solution --ship
  4. Navigate to your tenant's App Catalog site (e.g.: https://yourtenant.sharepoint.com/sites/apps) and go to the Apps for SharePoint library.
  5. Find the folder where the package was created by going to Visual Studio Code and finding the sharepoint | solution folder, right-clicking and selecting Reveal in explorer.
  6. Drag and drop the classification-extension.sppkg solution package to the Apps for SharePoint library.

You should be able to go visit your classified sites and see the extension at work. If it doesn't work, you may have elected to not automatically deploy the solution to every site when you built the extension. If that's the case, you'll need to add the extension to the sites by using Add an App.

Conclusion

It took 5 parts to describe how to build the extension, but we successfully created an extension that reads a site's security classification from its property bag and displays the site's classification in a label.

In our article, we manually set the classification by modifying the property bag, but in the real world, we'll want to use an approach that automatically classifies sites when they are created.

The code for this application (including any modifications I may have made to it since publishing this article) can be found at: https://github.com/hugoabernier/react-application-classification.

If you're interested in seeing how we might approach automatic classification, let me know in the comments and maybe I'll create another (series of) article(s).

I hope this helps!

 

In part 1 of this article, I introduced the concept for an SPFx extension that adds a header to every page, showing the classification information for a site.

In part 2, we created an SPFx extension that adds a header that displays a static message with the security classification of a site.

In part 3, we learned more about property bags and learned a few ways to set the sc_BusinessImpact property (a property we made up) of our test sites to LBI, MBI, and HBI.

In this part, we will finally get to add code to our extension that reads the property bag of the current site and displays the appropriate site classification label.

Reading the classification from the site's property bag

You can get the property bag of a site using a simple REST call to https://yourtenant.sharepoint.com/sites/yoursite/_api/web/allProperties, but it is even easier to use the SP PnP JS library to make queries like these.

Adding the SP PnP JS library to your project

Open the Visual Studio Code solution you created in part 2 and perform the following steps:

  1. Open the terminal pane (CTRL-`).
  2. From the terminal pane, type:
    npm i sp-pnp-js --save
  3. We'll need to add the ExtensionContext to the IClassificationHeaderProps interface. It will allow the ClassificationHeader component to access the context used to make PnP calls. We'll also add a couple of variables to the IClassificationHeaderState interface: one to keep the classification we'll retrieve from the property bag, and one to keep track of whether we're still loading the page.
    The code also defines the classification property bag name (sc_BusinessImpact) and the default classification ("LBI") for when it doesn't find a classification for a site. Feel free to change either of those values to what makes sense for your needs.
    Simply copy and paste the following code to ClassificationHeader.types.ts:
  4. Now we need to pass the ExtensionContext to the ClassificationHeader component. Open the ClassificationExtensionApplicationCustomizer.ts file and paste the following code (line 53 is the only line that was updated):
  5. Now we just need to make the ClassificationHeader component query the property bag when the component mounts, save the classification in the state variable, and change the render code to display the classification. Just copy the code below to ClassificationHeader.tsx:
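To give you a feel for the key part of that code, here is a rough sketch of reading the property bag in componentDidMount with sp-pnp-js (simplified from the actual component; error handling omitted, and the constants come from ClassificationHeader.types.ts):

// at the top of ClassificationHeader.tsx
import pnp from "sp-pnp-js";

// inside the ClassificationHeader component
public componentDidMount(): void {
  // show the loading state while we query the property bag
  this.setState({ isLoading: true });

  // tell sp-pnp-js to use the extension's context
  pnp.setup({ spfxContext: this.props.context });

  // retrieve all web properties and look for the classification
  pnp.sp.web.select("AllProperties").expand("AllProperties").get()
    .then((web: any) => {
      const classification: string =
        web.AllProperties[ClassificationPropertyBag] || DefaultClassification;
      this.setState({ businessImpact: classification, isLoading: false });
    });
}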

That should be it, let's try it!

  1. From the Terminal pane in Visual Studio Code, type:
    gulp serve
  2. It should launch the browser to the page you had set up in part 2, in serve.json. If prompted to run debug scripts, accept.
  3. Assuming that the default page is not one of your LBI, MBI, or HBI test pages, you should get the default classification value (e.g.: LBI).
  4. Change the first part of the browser's URL to point to your HBI page (change the part before ?debugManifestsFile=...), and it should tell you that the site is classified HBI.
  5. Repeat step 4 with your LBI and MBI sites and make sure that you get the right messages.

If everything went well, your sites displayed the right classification, but the message bar didn't change from the default yellow warning. Let's change that.

Changing the message bar type based on the site classification

  1. Change the render method of the ClassificationHeader.tsx to display a message bar type "warning" for MBI, and "severeWarning" for HBI, and "info" for everything else. The render method should look like this:

Try the LBI, MBI, and HBI test pages again just like you did before, except this time, you should get the following:

TestMBI2
MBI Test Site
TestHBI
HBI Test Site

Help! The extension stopped loading when I changed pages, and it stopped prompting me to load the debug scripts!

You most likely forgot to include the part after ?debugManifestsFile=… in the URL. Try to launch the extension again (gulp serve) and copy the part of the URL with the ? to your test pages.

(I know because I did this a few times)

How to debug the extension

In theory, the extension should work and load at least the default LBI message. But what if you want to debug the extension?

Here is a simple trick:

  1. Launch your extension by using gulp serve as you did above.
  2. Copy everything in the URL from the ? onward. It should look something like this:
    ?debugManifestsFile=https%3A%2F%2Flocalhost%3A4321%2Ftemp%2Fmanifests.js&loadSPFX=true&customActions=%7B%224017f67b-81c7-5631-b0e5-57bd266bc5c1%22%3A%7B%22location%22%3A%22ClientSideExtension.ApplicationCustomizer%22%2C%22properties%22%3A%7B%22testMessage%22%3A%22Test%20message%22%7D%7D%7D
  3. In your Visual Studio Code project, find launch.json under the .vscode folder.
  4. If you don't have such a file, you probably need to install the Chrome Debugger Extension for Visual Studio Code. Just go to https://aka.ms/spfx-debugger-extensions and follow the instructions to install it.
  5. Find the configuration entry that starts with "name": "Hosted Workbench" and paste the ugly URL you got in step 2 at the end of the URL marked "url". This will add the instructions to load the extension in debug mode.
  6. From the Terminal pane, type:
    gulp serve --nobrowser
  7. This will start the local web server but won't launch the browser.
  8. Set a few breakpoints where you want to debug the code by using F9. For example, the render method of the ClassificationHeader component.
  9. From the Debug menu in Visual Studio Code, select Start Debugging and it should launch Chrome to the page you specified in launch.json, prompt you to login, then prompt you to run Debug scripts. Accept and you should be able to debug through the code.

This should be all for today. The next part of this article will clean up some of the code, add localized strings, and prepare the code for production and deploy it!

 

In part 1 of this article, I introduced the concept for an SPFx extension that adds a header to every page, showing the classification information for a site. In part 2, we created an SPFx extension that adds a header that displays a static message with the security classification of a site.

Yes, static. As in hard-coded. I try to write these articles for people who don't have as much experience with developing SPFx extensions, so I included the step-by-step instructions.

In this article, we'll discuss how we use property bags to store the security classification.

What are property bags anyway?

Property bag is a term used to describe a serialized list of properties. It isn't unique to SharePoint -- I remember using property bags in the good old C days, but SharePoint has been using them for a long time. Remember this screen from SharePoint Designer?

AncientBag

Property bags are a convenient way to store a whole bunch of properties about things. In SharePoint, a property bag can be applied at the File, Folder, List, or Web level. When set at the Web level, it can be for a Site Collection or Site -- at least that's what MSDN said about SharePoint 2013.

The great thing about property bags in SharePoint is that they are attributes of their parent, which means they are protected the same way their parents are.

In theory, you could use a custom SharePoint list, add it to every site, manage the permissions, and add one row per property you want to store about each site, but that would be painful.

You could also store an XML or JSON file in every site that does the same, but then you'd have to write the code to create and store the file, protect it, and read it.

...or you could use the out-of-the-box mechanism to store metadata about a site, and let SharePoint create it and protect it. Also, you could use the countless ways to access the property bags (SharePoint designer, PowerShell, CSOM, PnP JS, Office 365 CLI, etc.).

So, for our Classification extension, we'll store and read from the site's property bag. To pay homage to Microsoft's own solution to Implement a SharePoint site classification solution, we'll use sc_BusinessImpact for the property name. You could name it anything you want, but you probably want to keep it somewhat unique.

Here is what the property bag looks like in SharePoint Designer 2013:

PropertyBagSharePoint

Storing custom properties in site property bags

In the previous article, I asked you to create test sites for LBI, MBI, and HBI tests. Now we'll store the values LBI, MBI, and HBI in the sc_BusinessImpact property in each respective site's property bags.

There are a few ways to do this, but since this is just for testing purposes, I'll offer two ways to cheat.

Setting a custom property using SharePoint Designer 2013

Yes, SharePoint Designer 2013 is still around, and it works with Office 365! What's more, you can use it to easily set custom property bag values!

  1. Using SharePoint Designer 2013, go to File | Open SharePoint Site and type the URL to your LBI site you created in the previous article in the Site name field.
  2. Once connected, select Site Options from the toolbar.
    SiteOptions
  3. On the Parameters tab in the Site Options dialog, you'll see the list of properties in the property bag. Don't mess with them.
    SiteOptionsNoPrp
  4. Select Add... to add a new property.
  5. In the Add Name and Value dialog box, type sc_BusinessImpact in the Name field, and LBI in the Value field. Select OK.
    SiteOptionsAdd
  6. Back on the Site Options dialog, you should see the new property you created. Select OK to dismiss the Site Options dialog.
  7. Repeat steps 1-6 with your MBI and HBI site, making sure to use MBI and HBI, respectively, in the Value field for step 5.

Storing custom properties using the Chrome SharePoint Editor Extension

If you haven't installed it yet, the Chrome SharePoint Editor Extension is a wonderful Chrome Extension that makes it easy to manage property bags. This is how to use it.

  1. Using Chrome, browse to your LBI site.
  2. Hit F12 or CTRL-SHIFT-I to open the Developer Tools.
  3. Find the SharePoint tab (should be one of the last ones, after Audit).
  4. From the Chrome SharePoint Editor navigation, select Web properties.
  5. In the New Property Name field, type sc_BusinessImpact
  6. In the New Property Value field, type LBI
  7. Select Add Property to submit your changes.
    PropertyBagusingspeditor.png
  8. You should see a toast notification at the bottom right of the screen indicating it worked.
  9. Repeat steps 1-8 with your MBI and HBI site.

What to do if you get errors setting the property bag values

It is possible that you run into an issue where SharePoint actively refuses to set the property bag. To resolve this issue, you need to temporarily set DenyAddAndCustomizePages to 0 on each site. To do so:

  1. Launch the SharePoint Online Management Shell.
  2. From the command-line, type:
    Connect-SPOService
  3. When prompted for it, enter the URL to your admin site (e.g.: https://mytenant-admin.sharepoint.com) and hit Enter.
  4. You'll most likely be prompted to log-in. Enter your credentials.
  5. Once connected, type the following, making sure to enter the URL to your LBI site:
    Set-SPOSite https://yourtenant.sharepoint.com/sites/testlbi -DenyAddAndCustomizePages 0
  6. Repeat the previous step with your MBI and HBI site URLs, then try again one of the two methods to set your site property bags.

If you wish to do so, you can re-run the above commands setting DenyAddAndCustomizePages to 1 after you're done setting your property bag values. Thanks to Asish Padhy for the inspiration to set DenyAddAndCustomizePages.

You may think "Bah, I can just go to the SharePoint Admin site, and go to the settings, and enable this", but as My SharePoint Log pointed out, you'll have to wait up to 24 hours for this to take effect.

Part III Conclusion

There are plenty of other methods to set property bag values, but the ones I listed above seemed the easiest.

I didn't spend too much time on how to set up the values because, in a real-world scenario, you shouldn't be setting the security classification property bag value by hand. It should be automatically configured when the site is created.

That's something we'll get to much later. For now, we'll focus on changing our hard-coded message bar and make it display the actual site classification.

In the next part of this article, we'll finally return to code and retrieve the site classification from the property bags and display the appropriate message.

In part 1 of this article, I introduced the concept for an SPFx extension that adds a header to every page, showing the classification information for a site.

We'll actually do the coding in this article!

Creating the SPFx extension solution

  1. Using the command line, create a new project directory:
    md classification-extension
  2. Change the current directory to your new project directory:
    cd classification-extension
  3. Launch the Yeoman SharePoint Generator:
    yo @Microsoft/sharepoint
  4. When prompted for the solution name, accept the default classification-extension.
  5. For the baseline package select SharePoint Online only (latest).
  6. When asked Where do you want to place the files? accept the default Use the current folder.
  7. When asked if you want to allow the tenant admin the choice of being able to deploy the solution to all sites immediately, respond Yes (unless you really want to deploy it to every single site manually).
  8. When asked for the type of client-side component to create, select Extension.
  9. Select Application Customizer when asked Which type of client-side extension to create.
  10. Almost there. For Application Customizer name, use ClassificationExtension. Keep this name to less than 40 characters, always.
  11. For Application Customizer description, enter Displays the site's information security classification.
  12. Watch the miracle that is Yeoman creating the project for you. It'll take a few minutes. Eventually, it'll say Congratulations! Solution classification-extension is created. Run gulp serve to play with it! We're not quite ready, yet.

Adding a static header

Now that the solution is created, we'll quickly add a header to test that our extension is working. We'll add the dynamic code later.

  1. Launch Visual Studio Code and open the new project you created. From the command line, type:
    code .
  2. We could add code to directly manipulate the DOM and insert elements, but I prefer keeping my components in separate .TSX files. It keeps everything simple (because every component is responsible for only one thing), which makes my brain happy. It also keeps everything modular. From your project's file explorer pane, navigate to src | extensions | classificationExtension.
  3. Right-click and select New Folder.
    AddingaFolder
  4. Type components as the folder name.
  5. On the newly created folder, right-click and select New File.
  6. Name the new file ClassificationHeader.types.ts. This file will contain all the types that the header component (to be created soon) will use.
  7. In the ClassificationHeader.types.ts file, paste the following (placeholder) code:
  8. Now right-click the components folder and select New File. Name your new file ClassificationHeader.tsx.
  9. Paste the following code in your ClassificationHeader.tsx.
  10. Finally, find the ClassificationExtensionApplicationCustomizer.ts file that was created by Yeoman and replace its content with the following code:

What the code does:

  • ClassificationExtensionApplicationCustomizer.ts: checks if there is a placeholder available called "Top". If there is, it calls the ClassificationHeader.tsx component to render. You are never supposed to assume that a placeholder is there, so check every time.
  • ClassificationHeader.tsx: renders a static/hard-coded Office UI Fabric MessageBar that says the site is MBI, and provides a fake link.
  • ClassificationHeader.types.ts: defines a property and state interface for the ClassificationHeader component. Right now, both are empty but we'll add some fields in future versions of this code.
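If you're following along without the embedded listings, here is a minimal sketch of what the static ClassificationHeader.tsx looks like at this stage (assuming the empty props and state interfaces from the types file; the link is intentionally fake for now):

import * as React from "react";
import { MessageBar, MessageBarType } from "office-ui-fabric-react/lib/MessageBar";
import { Link } from "office-ui-fabric-react/lib/Link";
import { IClassificationHeaderProps, IClassificationHeaderState } from "./ClassificationHeader.types";

export default class ClassificationHeader extends React.Component<IClassificationHeaderProps, IClassificationHeaderState> {
  public render(): React.ReactElement<IClassificationHeaderProps> {
    // hard-coded for now; a later part reads the real classification from the property bag
    return (
      <MessageBar messageBarType={MessageBarType.warning}>
        This site is classified as MBI. <Link href="#">Learn more about the proper handling procedures.</Link>
      </MessageBar>
    );
  }
}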

Testing that the extension works

Unlike SPFx web parts, you can't test your extensions in the SPFx Workbench. I hope that it'll be fixed in future versions of the workbench, but until then you need to test it on a real page on your Office 365 tenant.

Here is how to test your extension:

    1. In Visual Studio Code, find serve.json (located in the config folder).
    2. Find an entry that looks like https://contoso.sharepoint.com/sites/mySite/SitePages/myPage.aspx and replace it with the URL of a test page on your Office 365 tenant. For example: https://yourtenant.sharepoint.com/SitePages/Test-extension.aspx. There should be two instances to replace.
    3. From the Terminal pane (hit CTRL-`) type:
      gulp serve
    4. After a few moments, your favourite browser should launch and you should get a scary warning:
      DebugScriptWarning
    5. Select Load debug scripts and the page should load with our fancy message bar at the top.
      TestMBI

 

I would consider that a success! Except, of course, that the extension is hard-coded to say that the site is classified as MBI.

But first, we need to create some test sites and classify them.

Creating test sites

In your Office 365 tenant, create three new sites. You can use the Communication or Team site template, as long as you use a modern template.

The three sites will be:

  • TestLBI
  • TestMBI
  • TestHBI

You can use any naming convention you'd like, just make note of the URLs for each site because you'll need them in the next step.

We'll set the property bags on each of the three testing sites, but -- unfortunately -- it'll have to be in the next article.

 

 

 

Value proposition

As an independent consultant, I get to work with a lot of organizations in both public and private sectors. Most deal with various levels of security classification.

Governance is always a hot topic with SharePoint. Most understand the importance of governance; some shrug it off as a "we'll deal with it when it becomes a problem" -- which is never a good idea, as far as I'm concerned.

But what if we could make applying governance in SharePoint a lot easier? So easy, in fact, that it would be more painful to deal with it when it becomes a problem.

That's what I hope to do with this series of blog articles: demonstrate easy ways to introduce some level of governance using new enabling technologies -- like SPFx web parts, extensions, and site scripts.

My goal is not to duplicate the work of Microsoft and others; I may use a very simple approach in this first blog to keep the example easy to understand, but I fully intend on leveraging out-of-the-box Office 365 features like Data Loss Prevention (DLP) features.

I hope you'll stick with me for the journey!

Information security classification

Information security classification or information classification is a step in the process of managing information. There are people who are way smarter about this topic, and there is a whole ISO 27001 standard on the topic, so I'll avoid a detailed explanation.

…But I'll definitely throw in a gratuitous graphic. I guess my time at McKinsey & Company rubbed off on me.

Managing classified information typically consists of 4 steps:

  • Asset inventory: finding out what kind of information your organization has, and who is responsible for it.
  • Information classification: identifying how sensitive the information is. How bad would it be if this information was leaked, its integrity compromised, etc.? There is no one way to classify information -- it depends on your organization size, industry, country, etc. The most frequently used examples are:
    • Confidential: top confidentiality level
    • Restricted: medium confidentiality level
    • Internal use: lowest level of confidentiality
    • Public: everyone can see the information
  • Information labelling: you kinda need to tell your employees how the information is classified so that they can handle it properly.
  • Information handling: where you define rules and processes around how to handle the information.

This article will focus on the information handling part of the process.

Microsoft's information classification

Microsoft internally classifies their information as follows:

    • High Business Impact (HBI): Authentication / authorization credentials (i.e., usernames and passwords, private cryptography keys, PINs, and hardware or software tokens), and highly sensitive personally identifiable information (PII) including government-provided credentials (i.e. passport, social security, or driver's license numbers), financial data such as credit card information, credit reports, or personal income statements, and medical information such as records and biometric identifiers.
    • Moderate Business Impact (MBI): Includes all personally identifiable information (PII) that is not classified as HBI, such as: information that can be used to contact an individual (name, address, e-mail address, fax number, phone number, IP address, etc.); information regarding an individual's race, ethnic origin, political opinions, religious beliefs, trade union membership, physical or mental health, sexual orientation, commission or alleged commission of offenses, and court proceedings.
    • Low Business Impact (LBI): Includes all other information that does not fall into the HBI or MBI categories.

A while ago, Microsoft also released on GitHub a cool solution to apply their classification on SharePoint sites. They also have a great case study that shows how they approached classification of their own content.

So, since I want to keep things simple, I'll use HBI, MBI, and LBI classification labels in my example. You can use your own classification if you want.

Using SPFx extensions to add a header

If you read my equally long post on creating SPFx extensions, you'll know that you can use SPFx extensions to do cool things on every page of a site. To keep this example really simple, I'll create a header that reads the site's property bag and displays a very simple Office UI Fabric MessageBar indicating the site's classification. It isn't going to be particularly pretty, but we can improve on looks later.

The bar will say "This site is classified as [LBI|MBI|HBI]. Learn more about the proper handling procedures.", but you can make it say whatever is appropriate for you.

Here is what the HBI header will look like:
HBI header

The MBI header:
MBI header

And the LBI header:
LBI header

In the next article, we'll start writing the code.

 

Value proposition

As an independent consultant, I get to work with a lot of organizations in both public and private sectors. Most deal with various levels of security classification.

Governance is always a hot topic with SharePoint. Most understand the importance of governance; some shrug it off as a "we'll deal with it when it becomes a problem" -- which is never a good idea, as far as I'm concerned.

But what if we could make applying governance in SharePoint a lot easier? So easy, in fact, that it would be more painful to deal with it when it becomes a problem.

That's what I hope to do with this series of blog articles: demonstrate easy ways to introduce some level of governance using new enabling technologies -- like SPFx web parts, extensions, and site scripts.

My goal is not to duplicate the work of Microsoft and others; I may use a very simple approach in this first blog to keep the example easy to understand, but I fully intend on leveraging out-of-the-box Office 365 features like Data Loss Prevention (DLP).

I hope you'll stick with me for the journey!

Information security classification

Information security classification or information classification is a step in the process of managing information. There are people who are way smarter about this topic, and there is a whole ISO 27001 standard on the topic, so I'll avoid a detailed explanation.

…But I'll definitely throw in a gratuitous graphic. I guess my time at McKinsey & Company rubbed off on me.

Managing classified information typically consists of 4 steps:

  • Asset inventory: finding out what kind of information your organization has, and who is responsible for it.
  • Information classification: identifying how sensitive the information is. How bad would it be if this information was leaked, its integrity compromised, etc.? There is no one way to classify information -- it depends on your organization's size, industry, country, etc. The most frequently used examples are:
    • Confidential: top confidentiality level
    • Restricted: medium confidentiality level
    • Internal use: lowest level of confidentiality
    • Public: everyone can see the information
  • Information labelling: you kinda need to tell your employees how the information is classified so that they can handle it properly.
  • Information handling: where you define rules and processes around how to handle the information.

This article will focus on the information handling part of the process.

Microsoft's information classification

Microsoft internally classifies their information as follows:

  • High Business Impact (HBI): Authentication / authorization credentials (i.e., usernames and passwords, private cryptography keys, PINs, and hardware or software tokens), and highly sensitive personally identifiable information (PII) including government-provided credentials (i.e., passport, social security, or driver's license numbers), financial data such as credit card information, credit reports, or personal income statements, and medical information such as records and biometric identifiers.
  • Moderate Business Impact (MBI): Includes all personally identifiable information (PII) that is not classified as HBI, such as: information that can be used to contact an individual (name, address, e-mail address, fax number, phone number, IP address, etc.); information regarding an individual's race, ethnic origin, political opinions, religious beliefs, trade union membership, physical or mental health, sexual orientation, commission or alleged commission of offenses, and court proceedings.
  • Low Business Impact (LBI): Includes all other information that does not fall into the HBI or MBI categories.

A while ago, Microsoft also released on GitHub a cool solution to apply their classification to SharePoint sites. They also have a great case study that shows how they approached classification on their own content.

So, since I want to keep things simple, I'll use HBI, MBI, and LBI classification labels in my example. You can use your own classification if you want.

Using SPFx extensions to add a header

If you read my equally long post on creating SPFx extensions, you'll know that you can use SPFx extensions to do cool things on every page of a site. To keep this example really simple, I'll create a header that reads the site's property bag and displays a very simple Office Fabric UI Message Bar indicating the site's classification. It isn't going to be particularly pretty, but we can improve on looks later.

The bar will say "This site is classified as [LBI|MBI|HBI]. Learn more about the proper handling procedures.", but you can make it say whatever is appropriate for you.

Here is what the HBI header will look like:
HBI header

The MBI header:
MBI header

And the LBI header:
LBI header

Ready? Let's get coding!

Creating the SPFx extension solution

  1. Using the command line, create a new project directory:
md classification-extension
  2. Change the current directory to your new project directory:
cd classification-extension
  3. Launch the Yeoman SharePoint Generator:
yo @microsoft/sharepoint
  4. When prompted for the solution name, accept the default classification-extension.
  5. For the baseline package, select SharePoint Online only (latest).
  6. When asked Where do you want to place the files?, accept the default Use the current folder.
  7. When asked if you want to allow the tenant admin the choice of being able to deploy the solution to all sites immediately, respond Yes (unless you really want to deploy it to every single site manually).
  8. When asked for the type of client-side component to create, select Extension.
  9. Select Application Customizer when asked about Which type of client-side extension to create.
  10. Almost there. For Application Customizer name, use ClassificationExtension. Keep this name to less than 40 characters.
  11. For Application Customizer description, enter Displays the site's information security classification.
  12. Watch the miracle that is Yeoman creating the project for you. It'll take a few minutes. Eventually, it'll say Congratulations! Solution classification-extension is created. Run gulp serve to play with it! We're not quite ready, yet.
  13. Launch Visual Studio Code and open the new project you created. From the command line, type:
code .


An awesome part of SPFx is the ability to create SharePoint Framework Extensions. At the time of this writing, you can write three types of SPFx extensions:

  • Application customizers: to add scripts to pages and attach HTML to predefined (well-known) placeholder elements. At the moment, there are only a few page placeholders (like headers and footers), but I'm sure the hard-working SPFx team will announce new ones soon enough. For example, you can add your own customized copyright and privacy notices at the bottom of every modern page.
  • Field customizers: to change the way fields are rendered within a list. For example, you could render your own sparkline chart on every row in a list view.
  • Command sets: to add commands to list view toolbars. For example, you could add a button to perform an action on a selected list item.

This article doesn't try to explain how to create extensions -- there are many great examples on the SharePoint Framework Extensions Samples & Tutorial Materials GitHub repo, and the Overview of SharePoint Framework Extensions tutorial is a pretty good place to start if you haven't played with extensions yet.

In this article, I'll share a PowerShell script I use to deploy an extension to many sites at once.

But first, a few things you need to know:

  • To deploy an extension, you need to first deploy the solution (.sppkg) containing the extension, then add a custom user action to your site, web, or list. In other words, tell the site, web, or list to use the extension that you deployed in the solution. There are no user interfaces to add custom user actions.
  • When you add a custom user action, you can pass configuration properties to your extension.
  • It is possible to add a custom user action to the same site, web, or list more than once (because you could pass different configuration properties for every instance).
  • You can also specify a JSON file in your solution that will automatically deploy and add the custom user action, but you can't customize the configuration properties.

When you have a SharePoint tenant with lots and lots of sites, and you need to provide different configuration properties for each site, it can become painful to deploy an extension everywhere.

Sure, the solution deployment step is easy: just make sure that your package-solution.json has "skipFeatureDeployment": true, and SharePoint will kindly offer to automatically deploy your solution to every site for you.
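If you prefer a script for that step too, here's a minimal PnP PowerShell sketch of the deployment (the app catalog url and package path are my own assumptions, and the -SkipFeatureDeployment switch assumes a reasonably recent version of the cmdlets):

# Connect to your tenant app catalog site
Connect-PnPOnline "https://yourtenantgoeshere.sharepoint.com/sites/AppCatalog" -Credentials (Get-Credential)

# Upload and publish the package; SkipFeatureDeployment makes it
# available to every site without a per-site install
Add-PnPApp -Path ".\sharepoint\solution\[your-solution].sppkg" -Scope Tenant -Publish -SkipFeatureDeployment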

But to add an extension as a custom user action and provide configuration properties, you need to call a command or use some scripts.

When I need to do just one site, I'll often use the SPFx-extensions-cli, but when I need to do a whole bunch of sites, I like to use the PnP PowerShell cmdlets and PowerShell.

The idea came from the RegionsFooterProvisionCustomizer.ps1 script on Paolo Pialorsi's awesome Regions Footer Application Customizer example, which goes like this:

$credentials = Get-Credential
Connect-PnPOnline "https://.sharepoint.com/sites/" -Credentials $credentials

$context = Get-PnPContext
$web = Get-PnPWeb
$context.Load($web)
Execute-PnPQuery

$ca = $web.UserCustomActions.Add()
$ca.ClientSideComponentId = "67fd1d01-84e8-4fbf-85bd-4b80768c6080"
$ca.ClientSideComponentProperties = "{""SourceTermSetName"":""Regions""}"
$ca.Location = "ClientSideExtension.ApplicationCustomizer"
$ca.Name = "RegionsFooterCustomAction"
$ca.Title = "RegionsFooterCustomizer"
$ca.Description = "Custom action for Regions Footer Application Customizer"
$ca.Update()

$context.Load($web.UserCustomActions)
Execute-PnPQuery

Now Paolo's script will only work for his extension, but you can easily go in and change the ClientSideComponentId, ClientSideComponentProperties, Name, Title and Description and make it your own. And if you mistakenly re-run the script for the same site twice, the extension will appear twice.

But I wanted to repeat this for each of my tenant's bazillion sites, and provide different configuration properties -- if necessary. I also wanted to be able to re-run the script as many times as I wanted. Finally, I wanted the customer to be able to simply provide a CSV with a list of sites where they wanted the extensions applied.

So I tweaked Paolo's code to read the list of sites from a CSV file and apply the extension to each site. I borrowed a lot of this script from another example on the SharePoint Framework Extensions Samples & Tutorial Materials GitHub repo, but I can't find it anymore, so I can't tell who I should give the credit to. Let me know in the comments if you know who deserves the credit. I'm lazy, but I'm not a thief 🙂

First, make sure that you install the PnP PowerShell cmdlets on your workstation.

Then create a new PowerShell file and copy this code into it:


$credentials = Get-Credential

# Import the list of sites where we want to apply the extension
$sitesToProcess = import-csv "sites.csv"

# details of custom action/SPFx extension
[guid]$spfxExtId = "[extension id goes here]"
[string]$spfxExtName = "[extension name goes here]"
[string]$spfxExtTitle = "[extension title goes here]"
[string]$spfxExtGroup = "[extension group goes here]"
[string]$spfxExtDescription = "[extension description goes here]"
[string]$spfxExtLocation = "ClientSideExtension.ApplicationCustomizer"
[string]$spfxExtension_Properties = "[properties JSON goes here]"

function Add-CustomActionForSPFxExt ([string]$url, $clientContext) {
    Write-Output "-- About to add custom action to: $url"

    $rootWeb = $clientContext.Web
    $clientContext.ExecuteQuery()
    $customActions = $rootWeb.UserCustomActions
    $clientContext.Load($customActions)
    $clientContext.ExecuteQuery()

    $custAction = $customActions.Add()
    $custAction.Name = $spfxExtName
    $custAction.Title = $spfxExtTitle
    $custAction.Description = $spfxExtDescription
    $custAction.Location = $spfxExtLocation
    $custAction.ClientSideComponentId = $spfxExtId
    $custAction.ClientSideComponentProperties = $spfxExtension_Properties
    $custAction.Update()
    $clientContext.ExecuteQuery()

    Write-Output "-- Successfully added extension" 	
	
    Write-Output "Processed: $url"
}
function Remove-CustomActionForSPFxExt ([string]$extensionName, [string]$url, $clientContext) {
    Write-Output "-- About to remove custom action with name '$($extensionName)' from: $url"

    $actionsToRemove = Get-PnPCustomAction -Web $clientContext.Web | Where-Object {$_.Location -eq $spfxExtLocation -and $_.Name -eq $extensionName }
    Write-Output "-- Found $($actionsToRemove.Count) extensions with name $extensionName on this web." 	
    foreach ($action in $actionsToRemove) {
        Remove-PnPCustomAction -Identity $action.Id
        Write-Output "-- Successfully removed extension $extensionName from web $url." 	
    }

    Write-Output "Processed: $url"
}

# -- end functions --

foreach ($site in $sitesToProcess) {
    $ctx = $null
    $url = $site.Url
    try {
        Connect-PnPOnline -Url $url -Credentials $credentials
        Write-Output ""
        Write-Output "Authenticated to: $url"
        $ctx = Get-PnPContext
    }
    catch {
        Write-Error "Failed to authenticate to $url"
        Write-Error $_.Exception
    }

	# Make sure we have a context before continuing
    if ($ctx) {
		# Find out if the extension is already added
		$existingActions = Get-PnPCustomAction -Web $ctx.Web | Where-Object {$_.Location -eq $spfxExtLocation -and $_.Name -eq $spfxExtName }
		
		# Count how many existing extensions we found
		$count = $($existingActions.Count)
		
		# Don't re-install extension if it is already there
        if ($count -ge 1) {
			#This assumes that you don't want to duplicate extensions. If you do, feel free to change the logic below
            if ($count -eq 1) {
                Write-Output "Extension is already applied"
            }
            else {
                Write-Warning "Extension is duplicated!"
            }
        }
        else {
			# Add the extension
			Add-CustomActionForSPFxExt $url $ctx
			Write-Output "-- Successfully added extension $spfxExtName to web $url."
        }
		
        #Add-CustomActionForSPFxExt $url $ctx
        #Remove-CustomActionForSPFxExt $spfxExtName $site $ctx
        #Get-PnPCustomAction -Web $ctx.Web | Where-Object {$_.Location -eq "ClientSideExtension.ApplicationCustomizer" }
    }
}

Make sure to replace all the bracketed [sections] with your own information. I get the name and id from the extension's manifest.json file.

Then, create a CSV file containing all the sites you want to get the extension. It should look like this:

Url
https://yourtenantgoeshere.sharepoint.com/sites/Employee
https://yourtenantgoeshere.sharepoint.com/sites/Employee/About
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Calendars
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Learning
https://yourtenantgoeshere.sharepoint.com/sites/Employee/FAQs
https://yourtenantgoeshere.sharepoint.com/sites/Employee/News
https://yourtenantgoeshere.sharepoint.com/sites/Employee/InformationTechnology
https://yourtenantgoeshere.sharepoint.com/sites/Employee/MarketingAndCommunications
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Security
https://yourtenantgoeshere.sharepoint.com/sites/Employee/EnvironmentalSustainability
https://yourtenantgoeshere.sharepoint.com/sites/Employee/HealthAndSafety
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Fundraising
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Glossary
https://yourtenantgoeshere.sharepoint.com/sites/Employee/Parking
https://yourtenantgoeshere.sharepoint.com/sites/Employee/purchasing

Use your own urls, and save it as sites.csv in the same folder as the PowerShell script.

Then you can run the script: it'll connect to every site, apply the extension, and provide the configuration properties -- but only if the extension hasn't already been installed.

You could also tweak the script and the CSV to pass different configuration properties for each site -- a rough sketch of the idea follows, but I'll save the full version for another post.
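As a teaser, here's a minimal sketch of the idea: give sites.csv a second column (I'm calling it Properties -- the name is my own invention) and feed each site its own JSON instead of the shared $spfxExtension_Properties:

# sites.csv now looks like:
# Url,Properties
# https://yourtenantgoeshere.sharepoint.com/sites/Employee,"{""classification"":""HBI""}"

$sitesToProcess = Import-Csv "sites.csv"

foreach ($site in $sitesToProcess) {
    Connect-PnPOnline -Url $site.Url -Credentials $credentials
    $ctx = Get-PnPContext

    # Hand this site's own JSON to the function defined earlier
    $spfxExtension_Properties = $site.Properties
    Add-CustomActionForSPFxExt $site.Url $ctx
}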

Leave me a comment if you'd like me to post it.

I hope it helps!

As the World's Laziest Developer, I don't like to invent anything new if I can find something that already exists (and meets my needs).

This article is a great example of that mentality. I'm really standing on the shoulders of giants, combining a few links and re-using someone else's code (with credit, of course) to document my approach to versioning SPFx packages, with the hope that it helps someone else.

CHANGELOG.md: a standard way to communicate changes that doesn't suck

The problem with change logs

There are a few ways to communicate changes when working on a project: you can use your commit log diffs, GitHub Releases, your own log format, or any other standard out there.

The problem with commit log diffs is that, while comprehensive, they are an automated log of changes that include every-single-change. Log diffs are great for documenting code changes, but if you have a team of developers merging multiple commits every day between versions, they aren't great at summarizing the noteworthy differences.

GitHub Releases solves a part of this problem by making it easy to manually (or automatically) create release notes with git tags. (If you haven't looked into GitHub Releases, it is awesome -- take a look!)

However, GitHub Releases is still not very user-friendly (or manager-friendly).

You can always write your own change log format, but why not adopt a format and structure that you can use consistently across projects & teams?

CHANGELOG.md

This is where CHANGELOGs come in. According to Olivier Lacan at KeepAChangeLog.com, a changelog is...

"a file which contains a curated, chronologically ordered list of notable changes for each version of a project."

Changelogs use markdown syntax to make them easy to maintain. They follow a few principles (again, credit to KeepAChangeLog.com):

  • They are for humans, not machines: they should be easy to read and quickly make sense of relevant changes.
  • There should be an entry for every single version:
    • Latest version comes first: list versions in reverse-chronological order; it makes it easier to see what matters.
    • Release date of each version is displayed: use a consistent ISO standard date format (e.g.: 2018-04-16).
    • Versions should be linkable: this becomes handy when you have a giant changelog. Just wrap your version number with [] (e.g.: [0.0.1]).
    • Changes should be grouped by type of change: group your changes into Added, Changed, Deprecated, Removed, Fixed, and Security. Only include the groups of change types you have (no need to have a Deprecated section if you don't have any deprecated-type changes).
  • Mention whether you follow Semantic Versioning: you should, by the way.

How to use CHANGELOG.md in your SPFx project

  1. Add a new file in your project (wherever you put your README.md) and call it CHANGELOG.md.
    (Sure, you can name your changelog whatever you want, but the whole point of a changelog is to make it easy to find the changes on any projects, consistently. Just name it CHANGELOG.md. Trust me.)
  2. Paste this template in the new file you created:
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## [Unreleased]
### Added

- (List new added features)

### Changed

- (List changes to existing functionality)

### Deprecated

- (List soon-to-be removed features)

### Removed

- (List features removed in this version)

### Fixed

- (List bugs fixed in this version)

### Security

- (List vulnerabilities that were fixed in this version)
  3. As you work, keep a log of your changes in the Unreleased section, making sure to put the changes under their respective change types. If you want, you can even link to commits, but I don't.
  4. When you change your solution version, create a new version section below the Unreleased section. For example, for version 0.0.1 created April 16, 2018, insert the following text below the unreleased version:

## [0.0.1] - 2018-04-16

Remember that not everyone is an American-born, native English speaker. Use the ISO Standard format for dates. The French-Canadian in me thanks you.

  5. Copy all the changes from Unreleased to your new version section, making sure to remove any empty change type sections. For example, if you don't have any deprecated changes, remove the ### Deprecated section.
  6. This is what the final version of your CHANGELOG.md would look like:
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## [Unreleased]

## [0.0.1] - 2018-04-16

### Added
- (List new added features)

### Changed
- (List changes to existing functionality)

### Removed
- (List features removed in this version)

### Fixed
- (List bugs fixed in this version)

### Security
- (List vulnerabilities that were fixed in this version)
  7. Copy back the section templates to the Unreleased section and continue steps 3-7 with every new version.

Semantic versioning

I have worked with Microsoft technologies as long as I can remember, so it is ingrained in me that every version number should consist of 4 parts: Major, Minor, Build, Revision. For example, 1.0.0.0.

When you package an SPFx solution, the solution version always starts with version 1.0.0.0, and you can't make it lower than that. (Well, you can, but SharePoint will ignore it and it will become version 1.0.0.0).

Imagine my horror when, one day, I was trying to change the version number of a solution, searched for 1.0.0, and found that the NodeJS package also has its own version, stored in a file called package.json. What's worse, it didn't even have 4 parts!

The heresy!

After my initial indignation, I decided to research this and found that the versioning schema is called Semantic Versioning (or sem-ver for short). It consists of three mandatory parts: Major, Minor, Patch, plus an optional label for pre-release and build metadata. For example, you could have a version 1.0.0-rc for a release candidate version.

Hmmm, makes it easier to keep track of versions. And it is more human-readable, which is always good.

To keep things even more confusing, each web part can have its own version number. While there are valid reasons why you would want to keep the package version, the solution version and the web part versions separate, it quickly becomes impossible to keep track of versions.

To keep things clean, it makes sense to keep version numbers in sync.

npm version

Luckily, npm makes it easy to update your package.json version by simply calling:

npm version <major|minor|patch>

Where you specify to increase either the major, minor, or patch version.

For example, if you start with a package.json version 0.0.3 and want to increase the major version, you'd call:

npm version major

Which would produce v1.0.0.
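In case you want a cheat sheet, the three increments behave like this (the starting versions are just examples):

npm version patch   # 1.0.0 -> 1.0.1 (bug fixes)
npm version minor   # 1.0.1 -> 1.1.0 (new, backwards-compatible functionality)
npm version major   # 1.1.0 -> 2.0.0 (breaking changes)

As a bonus, if your project is a git repository (with a clean working tree), npm version will also commit the change and tag it with the new version number.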

If only there was a way to make it this easy to synchronize the package.json version to the package-solution.json version.

If only someone way smarter than I had thought of this...

Sync npm version with package-solution.json

It turns out there is such a person: Stefan Bauer!

In his blog post, he shares a way to add a Gulp function that automatically syncs the package.json version with the package-solution.json.

(Thanks Stefan for being awesome!)

To add this Gulp function, do the following steps:

  1. In your SPFx project, open gulpfile.js
  2. Before build.initialize(gulp); add my slightly modified version of Stefan's code. If it works, credit goes to Stefan. If it fails, it was my changes.
    let syncVersionsSubtask = build.subTask('version-sync', function (gulp, buildOptions, done) {
      this.log('Synching versions');
    
      // import gulp utilities to write error messages
      const gutil = require('gulp-util');

      // import file system utilities from Node.js
      const fs = require('fs');
    
      // read package.json
      var pkgConfig = require('./package.json');
    
      // read configuration of web part solution file
      var pkgSolution = require('./config/package-solution.json');
    
      // log old version
      this.log('package-solution.json version:\t' + pkgSolution.solution.version);
    
      // Generate new MS compliant version number
      var newVersionNumber = pkgConfig.version.split('-')[0] + '.0';
    
      if (pkgSolution.solution.version !== newVersionNumber) {
        // assign newly generated version number to web part version
        pkgSolution.solution.version = newVersionNumber;
    
        // log new version
        this.log('New package-solution.json version:\t' + pkgSolution.solution.version);
    
        // write changed package-solution file (synchronously, so the build can rely on it)
        fs.writeFileSync('./config/package-solution.json', JSON.stringify(pkgSolution, null, 4));
      }
      else {
        this.log('package-solution.json version is up-to-date');
      }
      done();
    });
    
    let syncVersionTask = build.task('version-sync', syncVersionsSubtask);
    
    build.rig.addPreBuildTask(syncVersionTask);
  3. Save your file

The final gulpfile.js should look like this:

'use strict';

const gulp = require('gulp');
const build = require('@microsoft/sp-build-web');

build.addSuppression(`Warning - [sass] The local CSS class 'ms-Grid' is not camelCase and will not be type-safe.`);

//BEGIN: Added code for version-sync
let syncVersionsSubtask = build.subTask('version-sync', function (gulp, buildOptions, done) {
  this.log('Synching versions');

  // import gulp utilities to write error messages
  const gutil = require('gulp-util');

  // import file system utilities from Node.js
  const fs = require('fs');

  // read package.json
  var pkgConfig = require('./package.json');

  // read configuration of web part solution file
  var pkgSolution = require('./config/package-solution.json');

  // log old version
  this.log('package-solution.json version:\t' + pkgSolution.solution.version);

  // Generate new MS compliant version number
  var newVersionNumber = pkgConfig.version.split('-')[0] + '.0';

  if (pkgSolution.solution.version !== newVersionNumber) {
    // assign newly generated version number to web part version
    pkgSolution.solution.version = newVersionNumber;

    // log new version
    this.log('New package-solution.json version:\t' + pkgSolution.solution.version);

    // write changed package-solution file (synchronously, so the build can rely on it)
    fs.writeFileSync('./config/package-solution.json', JSON.stringify(pkgSolution, null, 4));
  }
  else {
    this.log('package-solution.json version is up-to-date');
  }
  done();
});

let syncVersionTask = build.task('version-sync', syncVersionsSubtask);

build.rig.addPreBuildTask(syncVersionTask);
//END: Added code for version-sync

build.initialize(gulp);

Next time you build your package, the Gulp task version-sync will grab the package.json version (which you updated using npm version, right?) and will update package-solution.json, adding an extra zero at the end of the version number to Microsoftify the version.
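To put it all together, my release routine looks roughly like this (the order is my own habit, not an official requirement):

# Bump the version in package.json
npm version minor

# Build and package the solution for release;
# the version-sync task updates package-solution.json along the way
gulp bundle --ship
gulp package-solution --ship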

Once you've bumped the version number, go update your CHANGELOG.md file by moving the changes from [Unreleased] to a new section with the new version number you just created.

Sync package-solution.json version with webpart.manifest.json version

So far, we have done the following:

  • Created a CHANGELOG.md of unreleased changes
  • Maintained version number using npm version
  • Synchronized package.json versions with package-solution.json versions
  • Updated your CHANGELOG.md to describe the changes you made

But there is still one annoying thing: the web part versions (stored in webpart.manifest.json, where webpart is the name of your web part) can be different from the package.json and package-solution.json versions.

Turns out that it is pretty easy to fix:

  1. In your SPFx solution, open webpart.manifest.json where webpart is the name of your web part. For example, HelloWorldWebPart.manifest.json for HelloWorldWebPart.
  2. Find the "version" line and replace whatever version you have in there with "*", making it:
"version": "*",

Doing so will cause the version of the webpart.manifest.json to match the package-solution.json version.

(Turns out that the latest version of SPFx documents this by adding the following comment on the line above "version": "*".)

// The "*" signifies that the version should be taken from the package.json
"version": "*",

How cool is that?!

Conclusion

By using CHANGELOG.md to keep track of changes between versions, and using semantic versioning for your versions, you can make it pretty easy to document your changes across versions.

By using npm version, you can easily maintain the semantic version of your package.json.

By using Stefan's cool version-sync Gulp command, you can easily sync your package.json version and your package-solution.json.

By using "version": "*", you can synchronize your package-solution.json and your webpart.manifest.json versions.

Finally, by not reinventing the wheel and by leveraging the hard work of other people, you can do it all with very little effort!

I hope this helps you!

This is an easy one, but I keep Googling it.

When you create an SPFx web part, the default Property Pane automatically submits changes to the web part. There is no "Apply" button.

Property Pane without Apply
Default property pane -- no Apply button
But sometimes you don't want changes to the property pane fields to automatically apply.

All you have to do is add this method to your web part class (just before getPropertyPaneConfiguration is where I like to place it):
protected get disableReactivePropertyChanges(): boolean {
	return true;
}

When you refresh the web part, your property pane will sport a fancy Apply button!

PropertyPaneWithApply.png
Property pane with an Apply button

Property changes in the property pane will only get applied when users hit Apply.

That's it!

Hub sites?

Unless you're a SharePoint geek like me, you may not have been eagerly waiting for this new feature announced at Ignite 2017 in Orlando. Hub sites are a special site template that allows you to logically group team sites and communication sites under another site, with a shared navigation, theme, and logo.

Hub sites will also aggregate news and activities from any sites associated with them, and you can search within the scope of a hub site and its associated sites.

The picture Microsoft used in their announcement explains it best:

hubbahubba

The Problem

The typical corporate intranet is often nothing more than a re-hash of the company's corporate organization structure, blindly copied to a web site accessible to employees. If that intranet is done using SharePoint or Office 365, it'll consist of a bunch of site collections with some sub-sites.

(By the way, I completely disagree with using the org chart for your intranet structure, but I'll save it for another blog post).

What happens when your company restructures for (insert official reason here)? Let's say that you had a whole bunch of Divisions, each with their own site (or site collection) and they completely change the divisions every quarter (like the CEO of a former client of mine liked to do).

What happens when the IT, Finance, and HR team are no longer in the same groups?

You end up having to either:
a) Move sites around, breaking a lot of people's favourite shortcuts and links; or
b) Leave everything the way it is and give up hope.

Or, you could create a structure that doesn't need to change with the org-chart-of-the-week by using a flat structure. With the new modern sites in Office 365, it is a lot easier to create groups, team sites, and communication sites in a rather "flat" structure (every site is created in its own site collection, located under https://yourtenant.sharepoint.com/sites/ or https://yourtenant.sharepoint.com/teams/).

So, now you end up with a flat site structure that doesn't need to change when your information architecture changes again, but there is no easy way to navigate through this flat structure.

You can hack together some sort of global navigation with custom code and/or scripts, but every time someone wants to add a new site, you need to change the code.

The Solution

SharePoint hub sites allow you to continue creating a flat structure while logically grouping sites together in a semi-hierarchical fashion.

There are caveats:

  • As of this writing, you can only have up to 50 hub sites on your tenant.
  • You can add sites to hub sites, but you can't add hub sites to hub sites. And don't get me started about hub sites under hub sites under hub sites.
  • You need to be a SharePoint admin to create hub sites, but you can control who can add sites to which hub sites (see the sketch after this list).
  • You'll need to do some PowerShell.
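For example, here's a minimal sketch of granting "join" rights with the SharePoint Online Management Shell (the url and accounts are placeholders):

# Allow specific users to associate their sites with this hub
Grant-SPOHubSiteRights -Identity https://yourtenantgoeshere.sharepoint.com/sites/employeematters -Principals "user1@yourtenantgoeshere.com","user2@yourtenantgoeshere.com" -Rights Join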

Demonstration

We are going to create an Employee Matters hub, which will be the go-to place for employees to find resources related to being an employee of [XYZ Corp].

It will contain the following sites:

  • Benefits
  • Jobs
  • Training

Before you start

Download and install the latest SharePoint Online Management Shell.

Create "Sub" Communication Sites

  1. From your Office 365 environment, create a Communication site by going to the waffle
    waffle
    | SharePoint | Create site.
    createsite1
  2. From the Create site panel, select Communication site. It also works with Team sites.
    create site 2
  3. Choose the Topic layout and name the site Benefits. Give it a description if you'd like. Select Finish.
    Createsite3
  4. Repeat steps 1-3 above with Jobs and Training (or anything else you'd like to do), making sure to remember the url of every site you create (you'll need to go back to the sites you just created later).

Create a (future) hub site

Repeat steps 1-3 above again, but this time call the site Employee Matters. This will be the site that will be converted to a hub site. Make note of the site's url.

Register the hub site

  1. Start the SharePoint Online Management Shell.
    SPOMS
  2. From the PowerShell command prompt, type:
    Connect-SPOService -url https://yourtenantgoeshere-admin.sharepoint.com

    where yourtenantgoeshere is your own SharePoint tenant. Note that we're connecting to the Admin site, not the regular yourtenantgoeshere.sharepoint.com site.

  3. Once connected (you'll be prompted to login, probably), type:
    Register-SPOHubSite -site https://yourtenantgoeshere.sharepoint.com/sites/employeematters

    ...making sure to use the url of the Employee Matters site you created earlier. Note that this time, we are not using the -admin.sharepoint.com domain, just the regular .sharepoint.com domain.

  4. If all goes well, you'll get something like this:
    ID : 2be153d3-0fe8-4fb8-8fa0-b41dfdd8bd3f
    Title : Employee Matters
    SiteId : 2be153d3-0fe8-4fb8-8fa0-b41dfdd8bd3f
    SiteUrl : https://yourtenantgoeshere.sharepoint.com/sites/EmployeeMatters
    LogoUrl :
    Description :
    Permissions :
  5. Memorize the GUIDs. Just kidding! You can pretty much ignore the response -- as long as it didn't start spewing red text, you're doing fine.

At this point, if you got an error saying Register-SPOHubSite is not a valid command, you probably haven't installed the latest version of the SharePoint Online Management Shell.

If it gives you an error saying that hub sites aren't yet supported, go have a big nap and try again tomorrow.

You can go visit your newly created hub site. It should look like this:
employeematters1.png

It doesn't look much different than any other communication site, but it has an extra navigation bit at the top:

hubsite2

If your site hasn't updated yet, wait a little bit. Some of the changes take up to 2 hours, but every time I have done this, it was instant.

Optional: Set your hub site icon and description

You don't have to do this, but it is generally a good idea to label your sites and give them a custom icon. To do so:

  1. Upload an icon of your choice to a library of your choice (for this demo, I created a document library called Site Assets in the Employee Matters site). Make note of the url to the icon. The icon should be 64x64 pixels.
  2. From the SharePoint Online Management Shell thingy, enter the following:
    Set-SPOHubSite -Identity https://yourtenantgoeshere.sharepoint.com/sites/employeematters -LogoUrl https://yourtenantgoeshere.sharepoint.com/sites/employeematters/site%20assets/employeemattersicon.png -Description "Find resources for employees"

    Making sure to replace the LogoUrl with the url to the icon you want (and making sure that you put whatever description you want for the hub site).

  3. Your hub site will eventually get updated. Go take a look.

By the way, there is a user interface to change the hub site logo, but there isn't one to change the description. You can get to it by following these steps:

  1. Using your browser, go to your hub site.
  2. From the hub site home page, select the settings gear and select Hub site settings
    hubsite3.png
  3. From the Edit hub site settings pane that appears, you can change the icon or the hub site title. Not the description.
    hubsite4
  4. Select Save and your changes will (eventually) be reflected.

Associate "sub" sites to hub site using your browser

  1. Go to the Benefits site you created what seems like a million years ago.
  2. From the settings gear icon, select Site information
    sitesettings1
  3. From Edit site information pane that appears, select the Employee Matters hub site from the Hub site association, then select Save.
    sitesettings2
    Note that, in real life, only users who have been granted the rights to join a site will be able to do this -- but that's another blog post. Also, note that changing the hub site will change the site theme to match the hub site and add its navigation (as is clearly indicated on the Edit site information pane).

You should notice that your Benefits site will now have the Employee Matters navigation added at the top. That means it worked.

Associate "sub" site to hub site using PowerShell

  1. From the SharePoint Online Management Shell, enter the following:
    Add-SPOHubSiteAssociation -Site https://yourtenantgoeshere.sharepoint.com/sites/Jobs -HubSite https://yourtenantgoeshere.sharepoint.com/sites/EmployeeMatters

It will associate the Jobs site to the Employee Matters hub. Note that the -Site parameter is the site you want to add to the hub site, while the -HubSite parameter is the hub site.

Use either the PowerShell method or the browser method to add the Training site to the hub site.
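If you have more than a handful of sites to associate, a small loop around the same cmdlet saves a few clicks (a sketch, assuming you're still connected via Connect-SPOService):

$hub = "https://yourtenantgoeshere.sharepoint.com/sites/EmployeeMatters"
$sites = @(
    "https://yourtenantgoeshere.sharepoint.com/sites/Benefits",
    "https://yourtenantgoeshere.sharepoint.com/sites/Jobs",
    "https://yourtenantgoeshere.sharepoint.com/sites/Training"
)

foreach ($site in $sites) {
    Add-SPOHubSiteAssociation -Site $site -HubSite $hub
    Write-Output "Associated $site with $hub"
}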

Add links to the hub site navigation

The sites associated to your hub site now sport the new fancy hub site navigation, showing Employee Matters, but you'll notice that the navigation did not get updated to show the newly associated sites.

To fix this:

  1. Go to your hub site's home page. You can do so by clicking on Employee Matters from any of your associated sites.
  2. From the hub navigation (top left corner of the hub site, where it says Employee Matters) select Edit.
  3. From the navigation editing pane that appears, select the + button to add a new link.
    fancyplus
  4. In the Add a link pop-up that appears, enter the url to the Jobs site in the Address field, and type in Jobs for the Display name, then select OK.
    addlink
  5. Repeat until you have added Jobs, Benefits, and Training, then hit Save.
    hubsitenav

Your hub navigation will contain links to each associated site.

News, activities and search results from the hub home will include results from all associated sites, provided that the current user has permissions to each site. It takes a while before the results appear, but they will!

Conclusion

Hub sites are going to be a great addition to SharePoint in Office 365. They aren't going to solve every navigation issue, but they are certainly a step in the right direction.

There is still a lot to cover with theming and security, but that's probably enough for today.

(OR: How to solve the "this property cannot be set after writing has started." error when calling OpenBinaryDirect)

The Problem

I was trying to write a little app to programmatically download files from a SharePoint instance on Office 365 to a local folder on my hard-drive/network file share -- something I've probably done a thousand times -- using this code:

/*
* This code assumes you already have filled the following variables
* earlier in the code
* Code has been simplified for brevity
*/
var webUrl = "https://yourtenantgoeshere.sharepoint.com/site/yoursitename";
var userName = "yourusernamegoeshere@yourtenantgoeshere.com";
var password = "pleasedonteverwriteyourpasswordincode";
var listTitle = "yourdocumentlibrarytitle";
var destinationFolder = @"C:\temp";

var securePassword = new SecureString();
//Convert string to secure string
foreach (char c in password) {
    securePassword.AppendChar(c);
}
securePassword.MakeReadOnly();

using (var context = new ClientContext(webUrl))
{
    // Connect using credentials -- use the approach that suits you
    context.Credentials = new SharePointOnlineCredentials(userName, securePassword);

    // Get a reference to the SharePoint site
    var web = context.Web;

    // Get a reference to the document library
    var list = context.Web.Lists.GetByTitle(listTitle);

    // Get the list of files you want to export. I'm using a query
    // to find all files where the "Status" column is marked as "Approved"
    var camlQuery = new CamlQuery
    {
        ViewXml = @"<View>
            <Query>
                <Where>
                    <Eq>
                        <FieldRef Name='Status' />
                        <Value Type='Text'>Approved</Value>
                    </Eq>
                </Where>
            </Query>
            <RowLimit>1000</RowLimit>
        </View>"
    };

    // Retrieve the items matching the query
    var items = list.GetItems(camlQuery);

    // Make sure to load the File in the context otherwise you won't go far
    context.Load(items, items2 => items2.IncludeWithDefaultProperties
        (item => item.DisplayName, item => item.File));

    // Execute the query and actually populate the results
    context.ExecuteQuery();

    // Iterate through every file returned and save them
    foreach (var item in items)
    {
        // THIS IS THE LINE THAT CAUSES ISSUES!!!!!!!!
        using (FileInformation fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(context, item.File.ServerRelativeUrl))
        {
	    // Combine destination folder with filename -- don't concatenate
            // it's just wrong!
            var filePath = Path.Combine(destinationFolder, item.File.Name);

            // Erase existing files, cause that's how I roll
            if (System.IO.File.Exists(filePath))
            {
                System.IO.File.Delete(filePath);
            }

            // Create the file
            using (var fileStream = System.IO.File.Create(filePath))
            {
                fileInfo.Stream.CopyTo(fileStream);
            }
        }
    }
}

The "usings" at the top of the file were:

using System;
using System.Collections.Generic;
using System.Security;
using Microsoft.SharePoint.Client;
using System.IO;

And every time I ran the code, I'd get a really annoying error on the OpenBinaryDirect method:

this property cannot be set after writing has started.

If I wasn't already bald, I would be after searching everywhere how to solve it.

The Solution

As it turns out, when I created my console application, I followed these steps:

  1. Launch Visual Studio
  2. File | New Project... | Console Application and saved the project
  3. On the newly created project, added Microsoft.SharePoint.Client references by right-clicking on the project's References and selecting Manage Nuget Packages and selecting the first nuget reference that had Microsoft.SharePoint.Client that looked semi-official -- you know, the one that says "by Microsoft"

Wrote the code and quickly ran into the aforementioned error.

As it turns out, I needed to use the Nuget package that said Microsoft.SharePointOnline.CSOM (also by Microsoft).

I removed the Microsoft.SharePoint.Client Nuget package and added Microsoft.SharePointOnline.CSOM instead. It automatically included the right Microsoft.SharePoint.Client and Microsoft.SharePoint.Client.RunTime dependencies it needed.

After recompiling, it worked perfectly.

The way it should have done several hours ago.

After a lot of cursing, mostly directed at myself, I decided to write this down as a #NoteToSelf. Next time I run into this issue, at least I'll find a blog entry describing the solution.

My own.

In my previous article, I discussed best practices on how to choose high-resolution photos to use as user profile pictures for Office 365.

You can upload user profile pictures using the Office 365 Admin Center. It may be obvious to everyone else, but I didn't know this was possible until a very astute co-op student showed me this feature (after I spent an afternoon telling him the only way to do this was to use PowerShell). So, to save you the embarrassment, here is the web-based method:

  1. From the Office 365 Admin Center (https://portal.office.com) go to Admin then Exchange.
  2. In the Exchange Admin Center, click on your profile picture and select Another User… from the drop-down menu that appears.
    image
  3. The system will pop-up a window listing users in your Office 365 subscription. Search for the user you wish to change and click OK.
    image
  4. The system will pop-up the user’s profile, indicating that you are working on behalf of the user you selected. Scroll all the way to the bottom and select Edit Information…
    image
    image
  5. Another pop-up window (seriously, disable your pop-up blockers if you haven't done so already) will show the editable user profile page, starting with the Photo section. Click on Change
    image
  6. Click on Browse… and select the picture you wish to use.
    image
  7. Click Save to dismiss the window. Close all the pop-ups.

Repeat for all user profile pictures you wish to upload. If you have Lync open, you should see the results almost immediately.

The profile picture will also be automatically synched with SharePoint user profiles (at least, that has been my experience… please feel free to comment below if you’ve had different results).

While it may be handy to do a few pictures, if you have to update hundreds of user profile pictures, you may want to use the PowerShell method.

In Office 365, you can upload profile pictures for each user's contact card. The contact card will appear in Outlook, SharePoint, Lync, Word, Excel, PowerPoint… well, in any Office product that displays contact cards 🙂

Sample Contact Card in Outlook 2013
Sample Contact Card in Outlook 2013

While this isn’t a new concept to Office 2013, and this feature is available in On Premise installations, these articles focus on Office 365.

There are two ways to achieve this: using the web-based admin center, or using PowerShell.

You’ll find all sorts of confusing information online regarding the dimensions, file size and format restrictions. I found that either of the two methods described in this article will work with almost any file sizes and dimensions.

There are, however, some best practices.

Choose Square Photos

Choose a square image as the source (i.e.: same width and height), otherwise the picture will be cropped when you upload and you may end up with portions of people’s faces being cropped out.

Example of a great picture, wrong shape... (Photo Credit: rubenshito)

Will be automatically cropped to:

Auto-cropped result.

Go for the Max

Lync 2010 supported the ability to view contact photos which were stored as part of the thumbnailPhoto attribute in Active Directory, meaning that pictures could only be 48x48 pixels.

However, Lync 2013 can now store photos in the user's Exchange 2013 mailbox, meaning that it supports images of up to 648x648 pixels.

When you upload a photo to Exchange 2013, it automatically creates 3 versions of the photo:

Size      Used by
48x48     Active Directory thumbnailPhoto attribute
96x96     Outlook 2013 Web App, Outlook 2013, Lync Web App, Lync 2013, SharePoint
648x648   Lync 2013, Lync Web App

If you only upload a smaller image (e.g.: 48x48), it'll be scaled up to 96x96 and 648x648, resulting in photos that look fuzzy. However, if you upload photos that are already 648x648, the system will automatically generate the 48x48 and 96x96 thumbnails for you.

[Side-by-side comparison: original photos vs. their auto-scaled thumbnails]

(Photo Credit: rubenshito)

Note that if you upload a photo to the thumbnailPhoto in Active Directory, the photo will not be updated in Exchange. If you are lazy like me, you probably want to update photos only once.

My recommendation (and Microsoft's) is to use 648x648 pixels, 24-bit JPG images.
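If you want to sanity-check a folder of photos before uploading them, here's a quick sketch using .NET's System.Drawing from Windows PowerShell (the folder path is a placeholder):

# Load the .NET drawing assembly
Add-Type -AssemblyName System.Drawing

Get-ChildItem "C:\photos" -Filter "*.jpg" | ForEach-Object {
    $img = [System.Drawing.Image]::FromFile($_.FullName)
    if ($img.Width -ne $img.Height) {
        Write-Warning "$($_.Name) is not square ($($img.Width)x$($img.Height)) -- expect cropping."
    }
    elseif ($img.Width -lt 648) {
        Write-Warning "$($_.Name) is smaller than 648x648 -- the scaled-up versions may look fuzzy."
    }
    $img.Dispose()
}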

Although you can use the web-based GUI to update profile pictures on Office 365, sometimes you need to upload many pictures at once.

This is where PowerShell comes in handy. Here are the instructions to upload high resolution user profile pictures to Office 365 using PowerShell commands:

    1. Launch the PowerShell console using Run as Administrator
      image
    2. In the PowerShell console, provide your Office 365 credentials by typing the following command and hitting Enter:
      $Creds = Get-Credential
    3. You’ll be prompted to enter your credentials. Go ahead, I’ll wait.
    4. Create a PowerShell remote session to Office 365/Exchange by entering the following command and hitting Enter:
      $RemoteSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/?proxymethod=rps -Credential $Creds -Authentication Basic -AllowRedirection
    5. Initialize the remote session by entering:
               Import-PSSession $RemoteSession
    6. Doing so will import all the required Cmdlets to manage Exchange -- this is why you don't need to install any Exchange PowerShell modules or anything like that.
    7. If you get an error at this time telling you something about script execution not being enabled (or something like that, I never read the actual error message), enter the following command to enable remotely signed commands:
      Set-ExecutionPolicy RemoteSigned
      

      The above command is only required if you got an error. Some articles may say that you need to set the execution policy to Unrestricted, but -- being paranoid -- I prefer to limit the policy to remote signed commands. If you got an error while trying to set the execution policy, it is most likely because you forgot to Run as Administrator as indicated in step 1 above. Tsk tsk, pay attention!
      Once you set the execution policy without an error, try step 5 again.

    8. Once the session has been imported, you'll have new Cmdlets available, the most important one being Set-UserPhoto. But before you call Set-UserPhoto, you need to load the photo you want to use. To do so, call:
      $photo = "pathofyourphoto.jpg"
      

      Making sure to replace pathofyourphoto with the file name for the picture you wish to upload

    9. Now you can set the user’s photo by using the following command:
      Set-UserPhoto -Identity "testuser@xyz.com" -PictureData ([System.IO.File]::ReadAllBytes($photo)) -Confirm:$false

      Making sure to replace testuser@xyz.com with the user id of the profile you wish to change.

    10. Repeat steps 8-9 until all your pictures have been uploaded. One of these days, I'll write a script to iterate through all the pictures (there's a rough sketch after the full script below). Let me know in comments below if you need the full version.
    11. When done, call
      Remove-PSSession $RemoteSession

For your convenience, here is the whole PowerShell script:

$Creds = Get-Credential
$RemoteSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/?proxymethod=rps -Credential $Creds -Authentication Basic -AllowRedirection
Import-PSSession $RemoteSession
$photo = "pathofyourphoto.jpg"
Set-UserPhoto -Identity "testuser@xyz.com" -PictureData ([System.IO.File]::ReadAllBytes($photo)) -Confirm:$false
Remove-PSSession $RemoteSession
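And since I promised a bulk version: here's a rough sketch that loops through a folder of photos, assuming each file is named after the user's id (e.g. testuser@xyz.com.jpg) -- the folder and the naming convention are my own assumptions:

# Assumes $RemoteSession has already been created and imported as shown above
Get-ChildItem "C:\photos" -Filter "*.jpg" | ForEach-Object {
    # The file name (minus the extension) doubles as the user id
    $identity = $_.BaseName
    Write-Output "Uploading photo for $identity"
    Set-UserPhoto -Identity $identity -PictureData ([System.IO.File]::ReadAllBytes($_.FullName)) -Confirm:$false
}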

If you used the PowerShell script above, you’ll be able to upload 648x648 pixel photos without any issues for you and other users. If you didn’t use this script, but you get the following error:

The remote server returned an error: (413) Request Entity Too Large

...it is most likely because you connected to your remote PowerShell session without setting the proxy method.  Compare the two PowerShell commands:

Works only with photos 10Kb or below:
$RemoteSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Creds -Authentication Basic -AllowRedirection

Works with photos greater than 10Kb:
$RemoteSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/?proxymethod=rps -Credential $Creds -Authentication Basic -AllowRedirection

I hope the information above helped!

For more information

Set-UserPhoto CmdLet
http://technet.microsoft.com/en-us/library/jj218694.aspx

Configuring the use of high-resolution photos in Microsoft Lync Server 2013
https://technet.microsoft.com/en-us/library/jj688150.aspx