10 Free SharePoint Tools

An edited version of this article ran on Computerworld.com on September 20, 2017. Credit: Computerworld

The SharePoint software vendor ecosystem has produced many free tools that help you administer and configure a SharePoint farm or Office 365 deployment on a day-to-day basis. The vendors would, of course, love to have you upgrade to their for-pay software products, but until you do there is much utility in these free tools. Here are ten that many administrators find extremely useful.

1. Marco Wiedemeyer’s SPDeployment command line tool

Many companies develop solutions that live inside SharePoint, but their developers have a hard time deploying those solutions to the right spots within the SharePoint hosting infrastructure, whether that is on premises or up in Office 365. Who wants to keep all of those details straight every time you make a change to a SharePoint solution? Developer Marco Wiedemeyer built a tool that developers can run from the command line that reads a standard JavaScript Object Notation (JSON) file, automatically puts files where they should be, and marks properties as they need to be. It handles the credentials for logging into SharePoint as well. You can even use it with multiple sites and environments and trigger deployments as granularly as your developers need. It gets installed via the Node Package Manager (npm) and works right from a command console. Hosted on GitHub. Free.  https://github.com/mwiedemeyer/SPDeployment
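For flavor, here is a hypothetical sketch of what an SPDeployment configuration file might look like. The property names and structure below are assumptions on my part, so check the project's README on GitHub for the real schema:

```json
{
  "dev": {
    "SiteUrl": "https://contoso.sharepoint.com/sites/dev",
    "FastMode": false,
    "Files": [
      {
        "Source": "dist/app.js",
        "Destination": "SiteAssets/js",
        "Properties": { "Title": "App bundle" }
      }
    ]
  }
}
```

The idea is that each environment (dev, staging, production) gets its own block, so a developer can push the same artifacts to different sites with one command per target.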

2. The SharePoint Online Management Shell

If you have worked with Microsoft server products for any length of time in the past few years, you know that PowerShell is the way to get things done from an administrative perspective. The SharePoint Online Management Shell is a preconfigured PowerShell environment that collects all of the SharePoint Online (Office 365) cmdlets in one place; you can do basically anything from here, from creating content packages that migrate file share data to SharePoint Online, to creating new document libraries, to turning external access to certain SharePoint sites on and off. If you have even the slightest remit to manage Office 365, then you should grab this shell—it is a virtual certainty that having it will make your life easier. Be sure to right-click it after installation and run it as administrator, or essentially nothing will work. Runs on your local machine. Free. http://go.microsoft.com/fwlink/?LinkID=617148&clcid=0x409
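As a taste of what you can do once connected, here is an illustrative sketch. Connect-SPOService, Get-SPOSite, and Set-SPOSite are real cmdlets in the shell, but the tenant URLs below are placeholders, and you need a live Office 365 tenant and admin credentials for any of this to run:

```powershell
# Illustrative only -- replace the contoso URLs with your own tenant.
Connect-SPOService -Url https://contoso-admin.sharepoint.com

# Inventory every site collection in the tenant
Get-SPOSite -Limit All

# Turn off external sharing for one site
Set-SPOSite -Identity https://contoso.sharepoint.com/sites/projects `
    -SharingCapability Disabled
```

A few lines like these replace a lot of clicking around the admin center, and they can be saved as scripts and scheduled.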

3. Amrein Engineering Free Microsoft SharePoint Web Parts

Web parts have been around almost as long as SharePoint itself, but they have come in and out of “SharePoint fashion” as the product has matured over the years. Still, there are many tasks for which a web part—a pluggable component designed to run within SharePoint—is the best tool for the job. SharePoint 2016 comes with several built in, but groupware firm Amrein Engineering has developed around seventy web parts that do everything from reading Exchange conference room calendars to tracking individual stocks to performing overall task rollups across a given team. While your SharePoint developers can build complex solutions that ride atop the server, your users can grab these web parts and build simple pages and project sites themselves. Some of the 70 web parts are free; the rest come with evaluation periods and are easy to license right from the page. http://www.amrein.com/apps/page.asp?Q=5728

4. ManageEngine’s Free SharePoint Health Monitor Tool

If you’ve not invested in a lot of systems monitoring tools, or you have a smaller SharePoint deployment, you might want a lightweight tool that gives you just an overall rollup of your SharePoint farm’s health status at a glance. The ManageEngine Free SharePoint Health Monitor fits this bill nicely, giving you a convenient dashboard view where you can see details about the CPU, memory and disk space usage for each server running SharePoint.

Then you can drill down into the SharePoint workload itself and see the response time, service status, Web server (Internet Information Services) process details, and even SQL Server details like free pages, cache memory and buffer cache hit ratio. While this tool won’t help you with Office 365 deployments, and it does not appear to be supported for SharePoint 2016 installations, it does indeed work with 2007, 2010, and 2013, which are still widely used. Free; runs locally. https://www.manageengine.com/free-sharepoint-monitor/free-sharepoint-health-monitor-index.html

5. Visual Studio Community Edition 2017

In a land long ago and far away, SharePoint Designer was the preferred tool for non-developers to reformat, re-skin, and develop simple SharePoint solutions. Unfortunately, SharePoint Designer is no longer being developed and only works with SharePoint 2013 and below. The tool of choice now is the surprisingly good and free Visual Studio Community Edition with the Office Developer Tools extension. The Community edition is the free version of Microsoft’s very capable integrated development environment (IDE), and the Office Developer Tools plug-in lights up IntelliSense and debugging capabilities that let you run solutions right on the SharePoint server itself, remotely in Office 365, or in an Office web app. This tool works with essentially all versions of SharePoint no matter where they are hosted. Free; runs locally. https://www.visualstudio.com/vs/office-tools/#downloadvs

6. Office365Monitor

For those shops with significant deployments in Office 365, it can be really useful to have an eye on how the service is performing. Microsoft has promised at various points over the years more insight into the health of the service overall as well as its individual components, but we frequently see events that do not ever make it to a health dashboard. In the meantime, users blow up your phones asking what’s going on and where their files are. Ex-Microsoft employee Steve Peschka created Office365Monitor as a web service to gain deeper insight into each individual component of Office 365 and its uptime. You plug in the name of your tenant and the tool basically does the rest. There is a generous 90-day free trial and after that it is so inexpensive as to be effectively free. Web service; runs in the cloud; 90-day free trial and then starting from $19 per month. https://office365mon.com/

7. Veeam Explorer for Microsoft SharePoint

Veeam Explorer is basically Windows Explorer (or File Explorer in Windows 10, or Finder on the Mac) for SharePoint. It lets you browse the content database graphically, use full-text search to find documents and items, restore individual items as well as their permissions if they have been backed up, export recovered items back into SharePoint directly or as e-mail attachments, and more. It is also included in Veeam Backup Free Edition and can be used in conjunction with Veeam Endpoint Backup FREE, which makes this little tool extraordinarily useful. It works with on-premises SharePoint 2010, 2013, and 2016 in all editions, but it does not work with Office 365. Free with backup product; standalone 30-day free trial; runs locally. https://www.veeam.com/microsoft-sharepoint-recovery-explorer.html

8. Veeam Backup Free Edition 9.5

Veeam has another really useful tool for those shops with investments in on-premises servers. Sometimes your backup solution isn’t aware of SharePoint specifically, or maybe your backup just grabs virtual machines and copies them without doing anything intelligent on the processing side. Veeam’s free backup product is really quite good—I use it myself in my Hyper-V lab—and works with both Hyper-V and VMware. Picture your SharePoint VM farm: wouldn’t it be nice to clone, copy, export and manage those VMs? Wouldn’t it sometimes be useful to peek inside a VM to restore individual application items? Veeam Backup lets you do this on an unlimited number of ESXi and Hyper-V hosts or VMs. It is totally free and thus a great tool to have in your arsenal as part of a layered SharePoint on-premises backup strategy. Free; runs locally. https://www.veeam.com/virtual-machine-backup-solution-free.html

9. Refactored SharePoint SUSHI

SharePoint SUSHI was an open source project hosted on CodePlex that took the most common administrative tasks and put them in one powerful, user-friendly tool; you can think of SUSHI as a Swiss Army knife for SharePoint. While the original version, which supported only SharePoint 2007, languishes unloved on the deprecated CodePlex platform, Ivan Sanders, a SharePoint MCT, MCTS, MCITP, and MCSE, has refactored the tool for use with SharePoint 2013. It is unclear whether the tool works with SharePoint 2016, and it does not in any way interface with Office 365.

You can view the lists and sites any given user can access, which is really helpful for looking at effective permissions; upload user photos as profile images; back up and restore sites; apply a theme to a group of sites with one click and much more.

This is a Visual Studio solution that you download from GitHub and build yourself, or you can use the precompiled EXE in the same repository. Free; runs locally. https://github.com/iasanders/sushi/tree/master/Releases/Release%204.0/bin

10. SharePoint Color Palette Tool

If you are not a web designer or artist, then coming up with aesthetically pleasing color palettes can be a real challenge. With SharePoint 2013, 2016, and now Office 365, branding is more approachable than ever, and Microsoft has a nice little tool to help you create polished, composed color choices. Free; runs locally. https://www.microsoft.com/en-us/download/details.aspx?id=38182





Next Generation Authentication for Windows Shops

An edited version of this story ran on Computerworld.com on September 13, 2017. Credit: Computerworld

Authentication. The act of proving one’s identity to the satisfaction of some central authority. To most, this process means typing in a user name and a password, and it’s been this way for years and years.

But passwords—especially the passwords most enterprises require, complex ones with long strings of numbers and specially cased phrases with some (but not all! Heavens no, not the one you want) symbols—are difficult to remember and often end up written down on sticky notes. Then you have to reset them every so often so that hackers and crackers are chasing moving targets. Passwords can be leaked or hacked from the inside as well, as we have seen with numerous credential dump attacks over the past few years. And users can accidentally disclose their passwords if they fall victim to increasingly sophisticated phishing attacks.

Luckily for Windows shops, Microsoft is innovating in this space: it has introduced an enterprise-quality method of biometric identification and authentication that does not require the purchase of specialized hardware—and it is baked right into Windows 10, which many shops are already beginning to deploy to replace Windows 7, 8, and 8.1. In this piece, I want to take a look at this innovation, called Windows Hello for Business, show how it works, and explain how to enable it to secure your enterprise while eliminating the need for your users to handle cumbersome passwords.

Windows Hello for Business

Windows Hello is the most common and most widely known of the biometric authentication schemes that Windows supports. Windows Hello for Business takes the Hello idea and bundles it with the management tools and enforcement techniques that businesses want in order to ensure a uniform security profile and enterprise security posture. Windows Hello for Business uses Group Policy or mobile device management (MDM) policies for management and enforcement, and uses key- and certificate-based authentication in most cloud-focused scenarios for maximum protection.

Essentially, Windows Hello acts on two fronts: it can scan one’s fingerprint, and it can also take an infrared picture of a user’s face and perform analysis on it. It pairs these unique physical attributes of each user with cryptographic keys that replace passwords as authentication methods. These keys are stored within specialized security hardware, or are encrypted in software, and unlocked only after Windows deems them authentic. For organizations uninterested in biometrics, Windows Hello also supports PIN usage to replace passwords transmitted over the network.

Windows Hello protects Microsoft accounts (the accounts you use to log in to Microsoft cloud services, Xbox, Office 365, and the like), domain accounts that are part of a corporate Active Directory deployment, domain accounts joined to an Azure Active Directory domain (these are relatively new), and in the future, accounts protected by federated identity providers that support the Fast IDentity Online (FIDO) 2.0 protocol.

Why is Windows Hello considered stronger than a traditional password? For one, security is always better in threes—the best method of authenticating is to prove something you have, something you know, and something you are. Windows Hello can satisfy all three: something you are (your face or fingerprint, which are exceedingly difficult to copy and use in a malicious way), something you know (the PIN that Windows Hello uses by default from the point of registration onward), and something you have (the enrolled device itself, whose keys do not work from any other machine).

What is most interesting is that all of these biometrics are stored on the local device only and are NOT centralized into the directory or some other authentication source; this means credential harvesting attacks are no good against Windows Hello-enabled accounts, simply because the credentials do not exist in the place that would be hacked. While it is technically possible that each device’s trusted platform module, or TPM, could be hacked, an attacker would have to crack each individual user’s machine, versus executing one successful attack against a single machine: a vulnerable domain controller.

The security techniques involved in verifying the biometrics are rigid: special webcams and cameras designed to see in infrared can pick up the differences between a photograph of a person and the real presence of that person, and most laptop manufacturers are now including Hello-compliant cameras in their corporate lines of devices. You can also purchase compliant cameras separately, making a staged rollout possible. Fingerprint readers are mature technology and have been around for years, but Microsoft indicates the newest generations of readers register more detail on the first swipe, eliminating the need to swipe again and again like some previous models required; essentially all fingerprint readers compatible with any version of Windows can also be used with Windows Hello. You can use fingerprint readers, facial cameras, or both—whichever biometric you end up using is called the “gesture,” and the gesture is the action that begins the unlocking of public and private keys and the verification of a user’s identity.

The Registration Process

To use Windows Hello, you must register your user account so that Windows can generate the proper elements to replace the traditional password. First, the user configures an account on the device (or the administrator adds a user account to the device). The user authenticates the normal way during the registration process—using a user name and password—and the authentication source, most likely Active Directory, issues its standard yay or nay to that user’s credentials. The user can then enable his or her PIN, which becomes inextricably linked to that device and that user account.

Windows then generates a pair of keys, a public half and a private half, and stores them either in the hardware TPM module or, if the device does not have a TPM, encrypted in software. This authentication key is associated with just one biometric “gesture”—either a fingerprint, a face, or a PIN. Each subsequent gesture has a different protector key that wraps around the authentication key. While the container is designed to hold only one authentication key, multiple copies of that single authentication key can be wrapped up with the different protector keys associated with the different gestures registered on the device. There is also an administrative key that Windows automatically generates so that credentials can be reset when necessary, and the TPM has its normal block of data as well, containing attestations and other TPM-related information.

After the PIN is established and these keys are created as just described, the user can authenticate to the device in a trusted way, and Windows will then let him or her register a biometric gesture such as a fingerprint or face print.

Enforcing Windows Hello for Business through Group Policy

As you might imagine, you set up Windows Hello and enforce it throughout the enterprise organization through the use of Group Policy. Within the Group Policy Management Console, you can find policy settings under Policies / Administrative Templates / Windows Components / Windows Hello for Business in both the User configuration and Computer configuration hives. The important policies to configure are:

  • Use Windows Hello for Business: set this to Enabled to get started with the deployment.
  • Use biometrics: set this to Enabled to allow gestures instead of supporting only a PIN.

Alternatively, if you already have a mobile device management solution deployed, then you can use MDM to force the deployment of Windows Hello. The policies use the PassportForWork configuration service provider, which is like a template of potential settings that you will need to import into the MDM solution before you can begin configuring and enforcing policies.
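To give you a feel for the shape of an MDM policy, here is a hypothetical SyncML fragment targeting the PassportForWork configuration service provider. The node path follows the CSP's general layout, but treat the details as an illustration and verify them against the CSP reference in your MDM solution; {TenantId} is a placeholder:

```xml
<!-- Illustrative fragment only; confirm node names against the
     PassportForWork CSP documentation before deploying. -->
<Replace>
  <Item>
    <Target>
      <LocURI>./Device/Vendor/MSFT/PassportForWork/{TenantId}/Policies/UsePassportForWork</LocURI>
    </Target>
    <Data>true</Data>
  </Item>
</Replace>
```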

Key Points to Consider

Some important points to remember:

  • Credentials enrolled in Windows Hello for Business can be bound to individual laptops, desktops, or devices, and the access token one gets after successful credential verification is also limited to that single device.
  • During an account’s registration process, Active Directory, Azure AD, or the Microsoft account service checks and authenticates the validity of the user and associates the Windows Hello public key with the user account. The keys—both the public and private halves—can be generated in TPM modules (versions 1.2 or 2.0), or they can live in software for devices without the right TPM hardware. The Windows Hello gesture does not roam between devices and is not shared with the server; it is stored locally on a device and never leaves the device. When the PIN is entered and the face and/or fingerprint is applied, Windows 10 uses the private key stored in the TPM to sign data transmitted to the authentication source.
  • According to Microsoft: “Personal (Microsoft account) and corporate (Active Directory or Azure AD) accounts use a single container for keys. All keys are separated by identity providers’ domains to help ensure user privacy.” In practice, this means that keys get commingled within one secure container, although they are delineated by their native identity provider so that the wrong key is not sent to the wrong provider.

Sidebar: Why a PIN and not a password?

At first blush, a PIN seems like a password but worse: shorter, probably all one type of character (usually numbers), and most likely reused among a number of different places, including bank accounts, access to your mobile phone or tablet, and so on. However, the technical execution of how passwords are verified in the authentication process makes all the difference. Passwords are transmitted over the network to the authentication source where they are validated and either accepted or rejected. Because this transmission happens over the network, anyone with the right tools can snoop in, capture the credentials, and reuse them anywhere. And as we discussed earlier, if all of the passwords are stored centrally, one attack can potentially compromise all of the passwords. In Windows Hello for Business, the PIN is the gatekeeper to unlock a cryptographic key that is bound to the TPM in one individual machine. The PIN only works on the local device and does not enable authentication of any other kind from any other place.
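Conceptually, the flow is plain old public-key signing. The toy Python sketch below (deliberately tiny RSA numbers, nothing like the real TPM-backed implementation) shows the idea: the gesture unlocks a private key that signs a server challenge and never leaves the device, and the server verifies with the public half; no password or PIN ever crosses the network.

```python
# Toy sketch of challenge-response signing -- NOT real Windows Hello crypto.
# Real keys are 2048+ bits and live in the TPM; these numbers are tiny
# on purpose so the math is visible.
import hashlib

p, q = 61, 53
n = p * q                            # modulus (public)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent ("stays in the TPM")

def sign(challenge: bytes) -> int:
    """Device side: sign a server challenge with the private key."""
    digest = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
    return pow(digest, d, n)

def verify(challenge: bytes, signature: int) -> bool:
    """Server side: check the signature using only the public key (n, e)."""
    digest = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
    return pow(signature, e, n) == digest

nonce = b"server-challenge-123"
sig = sign(nonce)
print(verify(nonce, sig))  # True
```

The point of the sketch is the asymmetry: the server stores only the public half, so harvesting the server yields nothing that can be replayed as a credential.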

Active Directory Requirements

Fully enabling Windows Hello for Business will most likely require you to add at a minimum one Windows Server 2016 domain controller to your domain. While you do not have to raise your domain or forest functional level, the 2016 DC will light up some required authentication functionality. One alternative to shelling out for a 2016 license is to use Azure Active Directory to deploy Windows Hello.

There is detailed information about exactly what is required from a prerequisite standpoint on the Microsoft website (https://docs.microsoft.com/en-us/windows/access-protection/hello-for-business/hello-manage-in-organization). In particular, pay close attention to the key-based authentication requirements and the certificate-based authentication requirements; if you already have a public key infrastructure deployed in production, the certificate-based authentication method will be much easier to start with. If you are largely cloud oriented, then the key-based authentication method is the one to go with for your first Windows Hello deployments.

The Last Word

Security experts have for years been calling for the death of passwords as we know them, but that prediction has always been troubled by the lack of a seamless, affordable, user-friendly alternative for authenticating against systems. In practice, it was always going to take Microsoft putting biometric features inside Windows, the most popular operating system, to spur enough organizations to look into passwordless authentication, and it appears with Windows 10 that the Redmond software giant has done Just Enough to warrant the attention of enterprises everywhere. While it is unlikely your shop is in a position to remove passwords entirely, new machines you deploy can work with this option by default, and as you migrate to Windows 10 over time at your own pace, you can slowly but surely work Windows Hello for Business into your security profile.

A Look at the Microsoft Azure Internet of Things (IoT) Solution

An edited version of this story ran on Network World on September 7, 2017. Credit: Network World/Computerworld.

The Internet of Things, a vast network of connected microdevices, sensors, small computers, and more, throwing off data every second and sometimes even more often, is all around us. Whereas before we had dumb roads, dumb cities, dumb devices, and the like, now all of those things can talk to us and tell us how they are doing, what their current status is, what their environment is like, and even what other devices they know are near them—and they chatter all of this back to you. All of this is really coming to a head now because sensors are cheap, their processors are cheap, wireless is everywhere and becoming less expensive, and there are tremendous resources for storage and analytics out there. So how do you deal with this phenomenon and take it by the horns to make it begin working for you?


And deal with it you must, because the data is coming to you. Cisco research projects that there will be 50 billion connected devices by 2020, and all 50 billion of those will be sending off sensor data between once a second and once a minute. How much data is in each of those payloads? Assume it’s 500 bytes, to allow for some packet overhead: that means 25 terabytes of data being spun off somewhere between every second and every minute. What if Cisco is half wrong? What if Cisco is twice right? One thing is for certain: there will be a tremendous amount of data, and with that data come projections from Gartner, IDC, and Forrester that show a multi-trillion-dollar opportunity in both cost savings and new revenue from the IoT.
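If you want to sanity-check that arithmetic yourself, it is a one-liner (the 500-byte payload is, as noted, an assumption for the sake of the estimate):

```python
# Back-of-the-envelope check of the figures above.
devices = 50_000_000_000       # Cisco's 2020 projection
payload_bytes = 500            # assumed average payload, incl. overhead
total_bytes = devices * payload_bytes
terabytes = total_bytes / 10**12
print(terabytes)               # 25.0 TB per reporting interval
```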


One other factor is that IT is starting to fade into the background a bit, at least as the central place where technology projects begin. Often business units and individual departments are driving technology efforts, made possible by the fact that there are now cloud products that let you plunk down a corporate purchasing card and get started in minutes. Hire a data scientist contractor, outfit your kit with some sensors, start up a cloud service account, and before long you could have several terabytes of sensor data and be ready to get fired up. Microsoft has been innovating in this particular area with its Azure service, and it has some compelling offerings.


The Microsoft Azure Internet of Things Story
The Azure IoT Suite is designed to be a quick-start type of portal—a true example of a platform as a service (PaaS) offering—that gives you the resources necessary to deal with all of the data being sent back to you while also allowing you to manipulate it, develop some level of understanding of it, and use it to improve your business processes, solve some nagging problem you have deployed a boatload of sensors to track and mitigate, or build a model of the way certain parts of your assets behave.


NOTE: There is also a newer software as a service (SaaS) offering called Microsoft IoT Central, which takes the platform-level work away and focuses only on the software that powers the sensors and connects everything together. Manufacturers can build their own SaaS-based IoT solutions hosted on the Azure IoT cloud service and get their solutions to market more quickly without having to reinvent the plumbing, the platform, and more. There’s also the very new (as in spring 2017) Azure IoT Edge suite, which lets programmers develop logic for small computers and sensors on the “edge” of an IoT environment in convenient languages like Java and C# rather than assembly and other more obscure languages. In this story, however, I will focus on the Azure IoT Suite because I think it more clearly highlights the capabilities of the overall platform.


The Azure IoT Suite itself bundles together a bunch of relevant Microsoft Azure services into a surprisingly simplified package. It starts off by allowing you to create a couple of ready-made IoT consumption scenarios, including predictive maintenance and remote monitoring, and automatically orchestrates the various Azure products like databases, websites, web services, the data ingestion point, and more, creating and linking them together so that you are ready to go from square one. For example, for the remote monitoring solution that you can start with as a pre-configured package, Azure self-selects and configures the following services, handling the provisioning process automatically:

  • Azure IoT Hub (1 high-frequency unit, also called an S2 unit)
  • Azure Stream Analytics (3 streaming units)
  • Azure DocumentDB (1 S2 instance)
  • Azure Storage (1 GRS standard, 1 LRS standard, 1 RA-GRS standard)
  • Azure App Services (2 S1 instances, 2 P1 instances)
  • Azure Event Hub (1 basic throughput unit)


Each of the other solutions has a different makeup, but you get the idea: everything you need with just a couple of clicks.


The pitch from Microsoft is that while you might have the internal resources to do a couple of IoT-style projects today, as you build on those developments, create new models, deploy more sensors, and in general double down on IoT, you probably will not be able to handle all of that data cost-effectively. You will either be forced to invest in expensive storage infrastructure on premises, or you will have to make problematic choices about what data to keep immediately available, what data to roll up, summarize, and archive, and what data to discard. And of course, when you discard data, you cannot get it back, so you might be losing out on some predictive capability you do not know about yet; if you roll up and summarize data, you lose the granularity and resolution necessary to build some advanced mathematical models and use machine learning.


Instead, you can start right out in the cloud and take advantage of the tremendous scale of resources that Azure already has—and that is growing quite a bit each year. Instead of spending on disks and compute, you just pay for the Azure services and run times that you consume with your project, and you can scale up or scale down as your needs change. Even better, Microsoft is starting to make some of the glue work, so you can see the day when your Azure IoT data could be integrated with Power BI, for example, so that your regular knowledge workers (as opposed to trained mathematicians and data scientists) could query your data sets using natural language and get results back in a nice, graphical, easy-to-consume format. All of that glue and linkage would be much harder to create in an on-premises environment, and I think Microsoft is betting that IoT initiatives are new and green enough in most enterprises that it is not difficult to start them off in the cloud—or at least not as difficult as, say, deciding to move SharePoint into the cloud. In fact, right now, the Azure IoT tools integrate with the Cortana Analytics solution, which provides data science, number crunching, and machine learning tools, and you can then inform your business processes with the insights you derive by linking Cortana Analytics with the Microsoft Dynamics suite of enterprise resource planning (ERP) tools.


Imagine this type of scenario: you operate a fleet of large transportation vehicles, each equipped with two or more really expensive engines. These engines can be instrumented with sensors that report quite a bit of information, including fan speed, oil pressure, ambient temperature, fuel pressure, thrust at any given moment, air quality, vibration, and more. You start collecting this data across the thousands of engines in your fleet and correlating it with maintenance records, failure notices, mechanical delays that interrupt the scheduled service you normally deliver, and more. Over time and with some math, you are able to build a model showing that certain engine components are likely to fail after a certain number of cycles. You can learn which components those are, order those parts in advance, and adjust the fleet deployment schedule so that those parts can be replaced when the equipment is nearby, reducing interruptions and saving the cost of ferrying parts all around your locations. This is the kind of model you can build with the Azure IoT Suite (and it happens to be one of the sample models you can run as you get started with your account).
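A real model would come out of the machine learning tooling, but the core idea can be sketched in a few lines of Python. The component names, cycle counts, and 90 percent margin below are entirely made up for illustration:

```python
# Toy illustration of the predictive-maintenance idea: flag components
# whose accumulated cycles approach the failure point seen in history.
# All data and field names here are hypothetical.
failure_history = {          # component -> cycles where failures clustered
    "fuel_pump": 9000,
    "fan_bearing": 12000,
}
fleet = [
    {"engine": "E-101", "component": "fuel_pump", "cycles": 8700},
    {"engine": "E-101", "component": "fan_bearing", "cycles": 4100},
    {"engine": "E-202", "component": "fuel_pump", "cycles": 5200},
]

def parts_to_preorder(fleet, history, margin=0.9):
    """Return (engine, component) pairs within `margin` of typical failure."""
    return [
        (r["engine"], r["component"])
        for r in fleet
        if r["cycles"] >= margin * history[r["component"]]
    ]

print(parts_to_preorder(fleet, failure_history))
# [('E-101', 'fuel_pump')]
```

In practice the threshold would be a learned model rather than a fixed margin, but the payoff is the same: order the part before the engine fails.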


As far as the sensors go, last October Microsoft launched its Azure IoT Suite Device Catalog [https://catalog.azureiotsuite.com/], which showcases more than 500 devices from more than 200 partner manufacturers, all certified to work with the Azure IoT suite. On the developer and software side, the Azure IoT suite is a full-scale member of the Azure service, and thus works with Visual Studio, Eclipse, Chef, Puppet, GitHub, PowerShell, Python, MongoDB, Hadoop, Ruby, Docker, MySQL, and anything else that is part of the set of compatible offerings and capabilities with Azure.

How It Works

You can get started by heading over to https://www.azureiotsuite.com and logging in with your Microsoft account. There you can either use your current MSDN Azure benefit or sign up for a new subscription, and then you’ll be presented with the Provisioned Solutions page, which is the first page of the Azure IoT Suite package itself. Then, follow these steps.


  1. Click Create a new solution to build your own IoT “workspace.”
  2. You can then choose among a couple of different preconfigured solution types, including “connected factory,” “predictive maintenance,” and “remote monitoring.” For this walkthrough, I’ll show you remote monitoring, so click that option.
  3. The Create “remote monitoring” solution screen appears. Here is where you enter a friendly name for this workspace, the Azure region in which all of this should be spun up (you would ideally want the region closest to either you or your sensors to reduce latency), and the Azure subscription to which all of this should be billed. You can find pricing information for all of the components of Azure that the IoT suite will provision at https://azure.microsoft.com/en-us/pricing.
  4. Click Create solution, and then grab a cup of coffee while Azure spins up all of the resources it outlined.
  5. After the provisioning is complete, you’ll be back at the Provisioned Solutions screen, and your friendly named workspace will be shown there with a green check mark. Click the Launch button to get inside.
  6. You’ll be greeted with the dashboard screen. This shows a map of the Seattle area with four sensors geoplotted, each with a colored indicator (green or red). These sensors are simulated, just to give you an idea of the type of dashboard you can build with your own sensor deployment. On the right side, you can see the Device to View menu, which gives you a drop-down selector where you can pick individual sensors to examine. On the lower left side, there’s the Alarm History section, which shows sensors that are meeting some predefined problem threshold, and on the lower right you see speedometer-style graphs that show various properties and values that the sensor fleet is reporting.
  7. On the left side, click Devices. This gives you a grid-style list of devices. You can use the “+” button in the lower left to add a new sensor, which can be another simulated device or a physical device with a SIM card (ICC ID) for cellular connection or access to a wireless connection. You can also modify the properties the simulated sensor reports to the workspace, including model and revision number, asset tag, or anything else you like.
  8. On the left side, click Rules. You can add new rules that operate on the two existing data fields, temperature and humidity, and set the thresholds that apply to those rules. This area is what kicks off those alarm history items on the dashboard, and note that if a device is alarming, its status on the map is changed from green to red to make it easy to identify.
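The rule logic the steps above configure is simple threshold checking. Here is a minimal Python sketch of the idea; the field names and threshold values are invented for illustration and are not the actual Azure IoT Suite rule schema:

```python
# Minimal sketch of threshold-rule evaluation: a reading trips an alarm
# when it crosses a configured threshold, flipping the device indicator
# from green to red. Field names and values are hypothetical.
rules = [
    {"field": "temperature", "operator": ">", "threshold": 48.0},
    {"field": "humidity", "operator": ">", "threshold": 80.0},
]

def evaluate(reading, rules):
    """Return the list of rule fields this reading violates."""
    alarms = []
    for rule in rules:
        value = reading.get(rule["field"])
        if value is not None and rule["operator"] == ">" and value > rule["threshold"]:
            alarms.append(rule["field"])
    return alarms

reading = {"deviceId": "SampleDevice-003", "temperature": 51.2, "humidity": 64.0}
print(evaluate(reading, rules))  # ['temperature']
```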


That’s a quick walk around the preconfigured solution, but the key thing to remember is that all of this is live in Azure. You can adjust any of it, from the configuration of the dashboard to the way resources talk to each other to anything else; you manage all of this from within the Azure portal, same as any other Azure resource. If you’re looking for a remote monitoring solution, this preconfigured package saves you a lot of effort getting the right pieces in place—start there, tailor it, and build on from there. There’s no additional charge beyond the resources the solution spins up to run itself; the design and code are free.


The Last Word

Microsoft has a robust set of tools for integrating all sorts of devices into an IoT solution. They have more scale than you do and work with a wide variety of devices. If you are building an IoT solution, then you owe it to yourself to play around with what Azure IoT can do. I have not seen a solution in this space where it is easier to get started and make your own tweaks to build your own workspace. Worth a look.