Domain Controller in Azure VM with expired password

Came across an interesting situation this morning and thought I would drop the solution I found here in case anyone else needs to figure this out.

Situation:

  • Active Directory Domain Controller in an Azure VM
  • Your admin account has an expired password
  • RDP’ing to the machine says your password is expired and you need to set a new one, but it keeps sending you around in a circle saying you need to update it … but you can’t.

The first thing you will likely try is the Reset Password option in the Azure portal. It doesn’t work for Domain Controllers (this changed recently … no idea why). You get an error message that says:

VMAccess Extension does not support Domain Controller

At this point you start trying to figure out if there is another admin account you can use to log in with. In my case this was a dev/test AD box and it only had the one admin account on it.

Solution:

Before you go and delete the VM and build up a new one, try this interesting fix I found.

Updated 5/20/2021:  The new way to run PowerShell via the admin portal makes things really simple. 

  • Go to the VM in the Azure portal
  • Click Run Command in the left hand navigation
  • Choose “RunPowerShellScript” from the options
  • Paste in the following PowerShell (replacing the username and password with the ones you want to set)

net user <YourAdminUserName> <YourNewPassword>

Then click Run, give the script a while to complete, and your password will be reset correctly.
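
If you would rather script this than click through the portal, the same Run Command can be invoked via the Azure management REST API. Here is a rough sketch of building that request; the subscription, resource group and VM names are placeholders, and the api-version is an assumption (check the docs for current versions):

```python
# Sketch: build the Run Command REST call that mirrors the portal steps above.
# Subscription, resource group and VM names below are placeholders, not real values.

def build_run_command_request(subscription, resource_group, vm_name, username, password):
    url = (
        f"https://management.azure.com/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Compute/virtualMachines/{vm_name}"
        "/runCommand?api-version=2022-03-01"  # assumed api-version
    )
    body = {
        "commandId": "RunPowerShellScript",
        "script": [f"net user {username} {password}"],
    }
    return url, body

url, body = build_run_command_request(
    "<sub-id>", "<rg>", "dc01", "YourAdminUserName", "YourNewPassword"
)
```

You would POST that body to the URL with a bearer token from Azure AD; the portal's Run Command blade is doing essentially the same thing for you.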

Old way to do it: 

You can use the Azure portal and a VM extension to upload and run a script on the machine to reset the password for you. Here is how you do it.

  • Create a script called “ResetPassword.ps1”
  • Add one line to that script

net user <YourAdminUserName> <YourNewPassword>

  • Go to the VM in the Azure portal
  • Go into the extensions menu for that VM
  • In the top menu pick “Add”
  • Choose the Custom Script extension

  • Click Create
  • Pick your ResetPassword.ps1 script file
  • Click OK

Wait for the extension to be deployed and run. After a while you will see a status that looks something like this:

You should be set to RDP into your machine again with the new password you set in the script file.

I have no idea why the reset password functionality in Azure decided to exclude AD DCs … but if you get stuck I hope this helps.

-CJ

Importing and Exporting photos from Office 365 and Active Directory

Photos in Office 365 are a pain. This is because there are currently three(!) main places photos are stored in Office 365:

  • SharePoint Online (SPO)
  • Exchange Online (EXO)
  • Azure Active Directory (AAD)

If you are syncing profiles using AD Connect, those profiles are being synced into AAD. From there they take a rather arduous and rocky path to EXO and SPO via a number of intermediary steps that may not work or may only complete after a period of time. e.g. from EXO to SPO photos may only sync if someone’s photo has not been set directly in SPO and a particular profile property attribute is set correctly.

Needless to say it can be very hard to work out why you have one photo in AD on-prem, one in AAD, one in EXO and another in SPO … all potentially different!

At Hyperfish we built our product to push photos to all the places they should go when someone updates them. This means that when someone updates their photo and that update is approved, it is resized appropriately for each location and then saved into AD, EXO and SPO immediately. This results in the person’s photo being the same everywhere at the same time. Happy place!

However, we have some customers who want to bulk move/copy photos around between these different systems to get things back in order all at once. To do this we created a helpful wee command line utility. We have published the source code on GitHub, along with a precompiled release if you don’t want to compile it yourself.

Photo Importer Exporter

Creative naming huh! 🙂

The Photo Importer Exporter is a very basic Windows command line utility that lets you:

  • Export photos from SharePoint Online
  • Export photos from Exchange Online
  • Import photos to Active Directory
  • Import photos to SharePoint Online
  • Import photos to Exchange Online

All you do is Export the photos from, say, Exchange Online (the highest resolution location) and it will download them all to the /photos directory. Then you can Import them to, say, Active Directory and the utility will resize them appropriately and save them to AD for you. Simple huh.
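
To give a feel for the resizing step, here is a sketch of the kind of aspect-ratio-preserving scaling involved. The target sizes below are illustrative assumptions (AD’s thumbnailPhoto attribute is typically kept tiny, while EXO stores a higher-resolution image), not the exact values the utility uses:

```python
# Sketch of the downscaling logic when importing photos to each target system.
# The pixel targets here are assumptions for illustration, not the utility's values.

TARGETS = {
    "AD": 96,    # thumbnailPhoto is conventionally small (and size-limited)
    "SPO": 300,
    "EXO": 648,
}

def fit_within(width, height, max_side):
    """Scale (width, height) down so the longest side is max_side, keeping aspect ratio."""
    scale = min(1.0, max_side / max(width, height))
    return round(width * scale), round(height * scale)

# A 1296x972 photo exported from EXO, shrunk for AD:
print(fit_within(1296, 972, TARGETS["AD"]))
```

The `min(1.0, …)` guard means photos already smaller than the target are left alone rather than upscaled.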

Code: https://github.com/Hyperfish/PhotoImporterExporter
Releases: https://github.com/Hyperfish/PhotoImporterExporter/releases

It’s a work in progress and we will be adding bits to it as we come across other scenarios we want to support.

But for now you can grab the code and take a look, suggest additions, make pull requests if you like, log issues … or just use it 🙂

Happy coding,
-CJ

Tracking planes with Raspberry Pi and Docker

When you use a flight tracking app on your phone to see where a flight is, it’s very possible that location data has been crowd-sourced. Pretty cool!

Sites like FlightAware.com and FlightRadar24.com use feeds of data from people around the world to help build their datasets. Participating in those feeds is open to anyone who has some basic equipment. This works by listening to the ADS-B and Mode-S signals transmitted by aircraft. These signals identify aircraft and in some cases include positional data. It’s very easy to listen for these signals using a 1090MHz antenna and an ADS-B receiver. A couple of years ago I bought some equipment on Amazon, hooked it up to software running on a Raspberry Pi, and started feeding the data to FlightAware.com.

However, recently I stepped things up a notch with a new better antenna and dockerizing my setup. More on that in a moment, but first …

Getting started – for those who want to try this themselves

If you want to try this out yourself you will need some basic equipment:

  • Raspberry Pi
  • ADS-B USB stick
  • Antenna

You can buy kits from FlightAware on Amazon with everything you need included.

Once you have your equipment the best place to start is with PiAware – FlightAware’s Raspberry Pi pre-configured software. It walks you through everything needed to get you up and running and feeding their network with your juicy tracking data.

You should be up and feeding the network in an hour or two:


The “good” with the PiAware guide is it’s the simplest build process; the “bad” is that it’s specific to FlightAware and doesn’t set you up to feed other providers.

“Need more input!” – Short Circuit (1986)

Eventually you might find yourself wanting more “range”. The small indoor antenna might let you track aircraft 30 mi / 50 km away, depending on the terrain around you, line of sight, trees etc…

When you “need more input” you will need a better antenna. This may possibly require some WAF (“wife acceptance factor”) (or HAF, husband acceptance factor) … as it will likely require putting something outdoors and, for best results, on your roof.

I recently upgraded to the FlightAware made outdoor antenna.

Antenna1

My situation called for mounting it externally on the roof along a gutter line.  Ideally I would have mounted it on a peak of the roof, but I didn’t feel comfortable drilling holes in my roof, so opted for a mount that let me hang it from under the eaves.


Satellite Under Eave Mount 1 5/8

Here is what the setup looks like mounted.

Antenna2

The new outdoor setup and better antenna really bumped up my coverage. Even without optimal mounting (as you can see there are trees on the south side of our house) range went from < 50 mi to ~150-200 mi in some directions.


Dockerizing all the things

I like Docker containers. They make my life simple for running different apps and services on one box, and it seemed to make sense to me that you should be able to run the PiAware and dump1090 software in containers instead of on the Raspberry Pi directly.

I came across an article “Get eyes in the sky with your Raspberry Pi” by Alex Ellis who had done just that! In Alex’s setup however the configuration of the containers is baked into the docker images at build time which isn’t ideal.  I made some improvements like moving all configuration to Environment variables and added Docker-Compose support.

You can find the code and instructions here: https://github.com/LoungeFlyZ/eyes-in-the-sky
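
For flavor, the setup is wired together with a compose file along these lines. This is an illustrative sketch only; the service names, image names and environment variable names here are assumptions, so see the repo’s README for the real file:

```yaml
# Illustrative sketch - the actual compose file lives in the repo above.
version: "2"
services:
  dump1090:
    image: dump1090
    devices:
      - /dev/bus/usb            # pass the ADS-B USB stick through to the container
  piaware:
    image: piaware
    environment:
      - FEEDER_ID=${FEEDER_ID}  # your FlightAware feeder ID, taken from the host env
    depends_on:
      - dump1090
```

The point of moving configuration into environment variables is that the same images work for anyone; nothing personal is baked in at build time.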


With everything in Docker containers it was relatively simple to add a feeder to another tracking site, FlightRadar24.com. They also provide software, “fr24feed”, that takes a feed from dump1090 and processes/uploads it. You can find optional instructions in the ReadMe file on how to add this pretty simply.

Summary

I love this stuff.  It’s a fun project with hardware and software aspects to it.  Hanging out of a second story window being held by my wife around the waist was a “hilarious” exercise that I suggest every marriage attempts at some point.

I still have some re-wiring to do in the attic to secure the wiring a bit more, and possibly some more feeders to add for other tracking sites, before the project is complete.

Going forward I’m not sure what is next for this project yet. I’m sure there is more to be done and that I’ll likely be mounting more hardware on the roof at some point! LOL.

I hope you can enjoy the frivolity of a project like this as much as I do!

-CJ

Sniffing Azure Storage Explorer traffic

A friend asked a question about looking at how Azure Storage Explorer makes its API calls to Azure using something like Fiddler.

The issue with just firing up Fiddler and watching traffic is that, to decrypt HTTPS traffic, Fiddler installs a root certificate so that SSL is terminated in Fiddler first, letting it show you the decrypted payloads back and forth etc…

That is normally fine with apps that use the standard WinINET libraries etc… to make HTTPS calls (like Chrome). However, Azure Storage Explorer is an Electron app using Node.js and doesn’t use these. Node also handles root CAs a bit differently and, long story short, it doesn’t by default trust the root certificate that Fiddler installs. This means that HTTPS calls fail with an “unable to verify the first certificate” error.
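
To make the error concrete: a TLS client only accepts a server certificate that chains up to a root CA it already trusts, and Fiddler’s root isn’t in Node’s set. A tiny Python sketch of the same idea (the .cer path is a placeholder for the file you export below):

```python
import ssl

# A default TLS context trusts only the platform/bundled root CAs and requires
# verification - exactly why Fiddler's self-made root fails in Storage Explorer.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)   # strict verification is the default

# Importing a cert amounts to extending that trusted set, e.g.:
# ctx.load_verify_locations(cafile="FiddlerRoot.cer")  # placeholder path
```

Storage Explorer’s “Import Certificates” menu, covered later in this post, does the equivalent for its embedded Node runtime.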

Setting up Fiddler

First you need to set up Fiddler to decrypt HTTPS traffic. You do this in Fiddler’s options under Tools > Options > HTTPS.

This will prompt you to install a certificate that Fiddler uses to terminate SSL in Fiddler so it can show you the decrypted traffic.

Once you have completed this you need to export the certificate Fiddler installed so that you can set up Storage Explorer with it.

  1. Run MMC.exe
  2. File > Add/Remove Snap-in
  3. Pick Certificates, when prompted choose “Computer account” and “Local computer”
  4. Navigate to Certificates > Trusted Root Certification Authorities > Certificates
  5. Find “DO_NOT_TRUST_FiddlerRoot” certificate
  6. Right Click > All Tasks > Export  
  7. As you go through the wizard choose “Base-64 encoded X.509 (.CER)” for the file format
  8. Save it to your desktop or somewhere you will be able to find it later

Setting up Azure Storage Explorer

First up you need to configure Azure Storage Explorer to use Fiddler as a proxy. This is pretty straightforward.

In Storage Explorer go to the Edit > Configure Proxy menu and add 127.0.0.1 and port 8888 (Fiddler defaults). Note: no authentication should be used.

Now Storage Explorer will use Fiddler … however, you will start getting “unable to verify the first certificate” errors as Storage Explorer still doesn’t trust the root certificate that Fiddler is using for SSL termination.

To add the Fiddler certificate go to Edit > SSL Certificates > Import Certificates. Pick the .cer file you saved out earlier. Storage Explorer will prompt you to restart in order for these to take effect.

Now when you start using Storage Explorer you should start seeing its traffic in Fiddler and in a readable decrypted state like below.

Now you can navigate around, do various operations, and see what Azure Storage Explorer is doing and how it does it.

Happy Coding.
-CJ

Making perfectly clear ice

Call me weird … but I like clear ice in my drinks for some odd reason. So I went on a mission to learn how to make it at home.

Note: This is almost totally pointless. It tastes the same, is just as cold (I think) … but it looks sweet!

Here is how I do it:

-= Making the Ice =-
1. Boil lots of water
2. Let it cool to room temperature (don’t skip this step)
3. Boil it again
4. Let it cool to room temperature (yes, again)
5. Fill a small cooler with the water. Leave about 10cm (4″) of room at the top
6. Take the lid off the cooler or leave the lid open
7. Put the cooler in your freezer
8. Leave it for 24 hours
9. Take the cooler out of the freezer and it will be frozen on the top 8cm (3″) or so
10. Slide a knife down the side of the block and ease it around the block. This will let in air and release the block
11. Take the block out carefully and use a knife to remove sharp bits

-= Cutting it/Shaping it=-
Note: I have not mastered this bit yet

1. Get a saw or knife and score the block where you want to cut it
2. Place knife on score line and knock/hit it with a rubber mallet
3. Hopefully it will break cleanly
4. Repeat until you have the rough blocks you want
5. Use a hot fry pan to shape the blocks and make perfect sides of the block

Bask in the glory of your perfectly clear ice.

-CJ

RunAsRadio: Keeping Active Directory Data Up to Date with Chris Johnson

I recently sat down with Richard Campbell on the RunAsRadio podcast to talk about the state of directories, why people profile data is a critical component of SharePoint and Office 365 deployments and how Hyperfish can help organizations with their profile and directory mess.

Listen here: http://www.runasradio.com/Shows/Show/508

Microsoft Teams puts people at the center

Today Microsoft announced the much anticipated competitor to Slack: Microsoft Teams.

At its core, Teams is all about lightweight team collaboration spaces that include persistent chat, docs, notes, instant messaging, video calls and more. It really brings together many parts of Office 365 into one experience and is simple and easy for users.


Why is this so important for Office 365? Well, in short, Microsoft has had all these pieces of technology for many years but has not brought them together in a cohesive application experience for users … until now.

I’m excited about Microsoft Teams! For some time many have marveled at Slack’s simplicity and friction-free collaborative experience, but winced at having to manage, purchase and integrate yet another application. Teams is baked into Office 365 and it comes for free. Is it as good as Slack? We don’t know yet. It’s only in preview currently and only time will tell. But it’s off to a good start.

Reliance on people profile data is more important than ever

Virtually all the new experiences going into Office 365 have people at the center of them. Microsoft Teams is no different.

People are at the center of Microsoft Teams. However, to get the best experience in Teams, and to make it as compelling and interesting as the demos suggested, you must have great people profile information.

Org charts

Take for example the org chart they showed in the demo. It looks great and gives you all the information you need when looking at where in the organization someone sits. It’s also interactive and you can navigate around etc…

But 75%+ of companies don’t have the org structure data needed in Office 365 to power this feature!


Photos everywhere

Or take the photo thumbnails that show up everywhere to help you connect with the person you are IM’ing with and make it more personal.


Those photos come from Azure AD and Office 365. If you don’t have your users’ profile photos in AD then chances are you will be getting a subpar experience in Teams.

Bots and automation

WhoBot was another example of how people profile information is powering new experiences. WhoBot is a chat bot that lets you look up people in the organization based on their name or skills for example. Again, all this is driven from people profile information in Office 365.

So what if you don’t have great profile data?

Most organizations don’t have great data in Active Directory, Azure Active Directory or Office 365 profiles. So don’t worry, you are not alone 🙂

This is why we created Hyperfish to help organizations understand their profile data and then complete or fix the missing or incorrect information.

Our mission is to make experiences like Microsoft Teams, Delve and people search everything they should be, with complete and accurate profile data.

If you don’t know how good your profile data is, you can try our free analysis app! It supports Active Directory on-premises as well as Azure Active Directory / Office 365.

Try it out here: https://app.hyperfish.com

If you want to fix your data you can have Hyperfish reach out to users through email or IM to gather it from the folks missing data. e.g. missing photos, incorrectly formatted phone numbers, non-standard addresses, non-standard job titles etc… Learn more here about the full Hyperfish product: https://hyperfish.com/

At Hyperfish we are pumped about Teams! The preview looks great and we can’t wait to see what else comes as part of it before it is due for release in Q1 of 2017.

We hope you enjoy these new capabilities and get in touch if you need a hand with keeping your people profile data clean and up to date!

-CJ

Top 10 people centric features in Office 365 you are probably missing out on

At Ignite 2016 I did a quick 20 min theatre session in the expo hall on driving better adoption of your Office 365 rollout by leveraging people centric features. A lot of organizations don’t make it part of their plan to ensure they have the right people profile data set up to make the best use of the ever expanding range of experiences in Office 365 that rely on it.

They are letting their users down, IT down and the business down.

Successful adoption means focusing on users

In Office 365 most of the profile data comes from Azure Active Directory. Sadly, it’s all too common to see it poorly populated.

As an example, here are some charts from our Hyperfish Directory Analyzer tool that are very common:

Missing this data impacts your Office 365 deployment. Some of it is simple yet important, and some is business critical.

Here are the top 10 things I feel people most commonly miss out on due to poor profile data, each of which drives better user adoption and use of these tools.

#1 – Delve People Profiles

Microsoft is investing in making the Delve people profile pages THE definitive profile page in Office 365 for users. Search etc… will use these. Without great data these pages look bland and don’t provide users the information they need to find and connect with people quickly.

Delve without good data

Delve with good profile data

#2 – Contact cards everywhere

Contact cards pop up in all sorts of places, e.g. in Outlook (clicked tens of millions of times a month no less!). Without great data they make it hard to connect and discover people.

Poor example

Populated contact card


#3 – New SharePoint experiences

SharePoint is in the middle of a visual makeover in Office 365. More people-data-driven experiences like the new rich contact panels will be popping up all over the place.

New SharePoint contact popups

#4 – Mobile SharePoint Intranet

Along with better desktop web experiences, SharePoint has released new mobile apps to help you find and connect with people while you are on the move. Great for finding someone’s phone number while out and about.

Mobile people profile

#5 – Office apps

There are plenty of places in Office applications like Word and Excel where people information pops up. For example while co-authoring a document with someone else. Good profile data makes it a much nicer experience for seeing who is in the document.

Co-Authoring in Word

#6 – Dynamic Distribution Lists in Exchange

Not many people know, but you can create dynamic distribution lists that include people that have specific words in their profiles. Great way to ensure that all Sales people are in the right DLs!

Dynamic DLs in Exchange online.

#7 – Workflows

One of the most common things people want to do in workflows is escalate a task to someone’s manager if they are out of the office or take too long to respond. Without good organization hierarchy data in profiles this isn’t possible! This means people have to write more code to look it up from somewhere else e.g. an HR system, or replicate the data somewhere else (which will become stale the day after).

Nintex workflow in Office 365

#8 – People Search

Here is a test. Go to your search portal in Office 365, flip to the people search tab and type “Sales”. Did you get great results? Probably not. People search is driven by people profile data and without it search is hard to use and find the right people at the right time.

Rich people search results in Office 365

#9 – Groups

Groups are slowly becoming the new team sites for projects etc… They are rich with people data and conversations. Make sure they sing with decent profiles.

Office 365 Groups

#10 – Skype for Business

Skype for Business uses people profile data for its search, dialing and in calls. If you don’t have good profile data you can’t find the right people in search, you can’t dial people if they have poorly formatted phone numbers, and you don’t get to see who you are talking to without nice profile pictures. Help remote users connect with one another by ensuring their experience rocks.

Skype for Business

Summary

These are just a few of the experiences I think most people are missing out on in Office 365 due to poor profile data. I founded Hyperfish to help people whip their profiles into shape and start making these experiences rock for users, thus driving satisfaction, adoption and reuse of the tools you have already bought.

I hope you are getting the most out of your Office 365 investment and not letting something like profile data get in the way of users loving it.

-CJ

Microsoft Graph spanning on-prem and online!

One of the most interesting announcements, at least in my mind, this week from Ignite 2016 was the news that Microsoft is adding support for Microsoft Graph API in hybrid deployments.

This means you can call the Microsoft Graph API like you would normally for Office 365 based mailboxes, for example, but have it actually connect with a mailbox that resides in an on-prem exchange server!

Let that sink in for a moment. That is pretty sweet!

Typically, if you have an application that is not deployed behind the firewall, i.e. inside the organization’s network, you couldn’t call things like SharePoint or Exchange APIs without doing network gymnastics like VPN, reverse proxy or putting holes in your firewall (yuk).

Now with this hybrid support in the graph you can simply call internet based REST APIs and Microsoft is doing the work of facilitating that communication back to the on-prem resource.
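
The nice part is that the call shape is identical either way: a hybrid mailbox is addressed through the same REST endpoint as a cloud one. A minimal sketch (token acquisition is elided, and the user address is a placeholder):

```python
import urllib.request

# Sketch: the same Graph REST call works whether the mailbox lives in Office 365
# or, with the hybrid preview, on an on-prem Exchange server.
GRAPH = "https://graph.microsoft.com/v1.0"

def build_messages_request(user, token):
    """Build a GET request for a user's most recent messages."""
    url = f"{GRAPH}/users/{user}/messages?$top=10"
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")
    return req

req = build_messages_request("someone@contoso.com", "<access-token>")
print(req.full_url)
```

Your app never needs to know, or care, where the mailbox actually sits; Azure AD authentication and Microsoft’s plumbing sort out the routing.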

Currently in preview, the Graph only supports Mail, Calendar and Contacts in this hybrid configuration, however I can only imagine that support for other endpoints like Users, SharePoint etc… will come at some point.

You also have to have Exchange 2016 CU3 servers deployed on-prem to get this support, and sync your AD to Azure AD, as authentication is managed this way.

You can read more about these pre-reqs here: http://graph.microsoft.io/en-us/docs/overview/hybrid_rest_support

I think this is a huge benefit for those who are looking to build applications or cloud services that connect to data wherever that sits.

Couple of interesting scenarios to think about:

  • Mobile apps that work outside the firewall, previously either not possible or too hard due to connectivity issues
  • Cloud service web apps that you want to connect to on-prem data
  • Tools and Apps that can now work with data either in Office 365 or on-prem


This to me is a huge step in the right direction for Microsoft in their quest to make developers lives better in a hybrid world.

Hybrid is not a transitional state. For many it’s the end state. 
John Ross and Randy Drisgill

I’m looking forward to more endpoints coming online in the months ahead!

-CJ

Hyperfish Directory Analyzer

Exciting news! Last week we released a beta of our free directory analysis product! 

This is the first small step towards a much larger product we have coming later in September at Microsoft’s Ignite conference for helping organizations leverage their investments in SharePoint and Office 365 better. It will ensure organizations have rock solid people profile data in their directories to power experiences like profiles, Delve, search, personalized intranets, automated business process decision making and more.  But more on that in the coming weeks (stay tuned to www.hyperfish.com also).

So what does this directory analysis do?

Simply put, it scans your on-prem Active Directory or online Azure Active Directory and tells you interesting facts and figures about the completeness and accuracy of your users’ profile data. We send you a report on what we find.

 Screenshot of Report

We have been testing it with our beta users and now we are ready to have others try it out.

It’s a beta, you might hit a bug, and hopefully you will have feedback for us! If you do, please reply to the email you get from us with your thoughts.

Try it out here:  https://app.hyperfish.com

We would love any feedback you have! Also feel free to reach out on Twitter @LoungeFlyZ if you hit any issues.

Thanks!