Hey guys!
Today I want to build on the last post about Intune and certificate services. Once we get the certificates onto the devices, the next step is to configure our services to accept certs as a form of authentication.
I want to talk about configuring Exchange Online in this post and some caveats when setting that up.
Per usual, I don't want to spell out a full guide for everyone; those can be found in a multitude of places. There is a good one here.
https://blogs.technet.microsoft.com/messagingblogs/2017/02/16/certificate-based-authentication-o365/
What I do want to talk about are some of the gaps that this guide didn't cover.
First thing is the intermediate CA. In a best-practice PKI deployment you should have both a root and an intermediate certificate. I would publish both certificates and their CRLs to Azure AD using the guide above. When you deploy the intermediate cert, be sure you change the PowerShell from this:
# Requires the AzureAD module and an active session (Connect-AzureAD)
Get-AzureADTrustedCertificateAuthority   # lists the CAs already trusted in the tenant
$Cert = Get-Content -Encoding Byte "Location of Root CA CER file"
$New_CA = New-Object -TypeName Microsoft.Open.AzureAD.Model.CertificateAuthorityInformation
$New_CA.AuthorityType = 0                # 0 = root CA
$New_CA.TrustedCertificate = $Cert
$New_CA.crlDistributionPoint = "CRL Distribution URL"
New-AzureADTrustedCertificateAuthority -CertificateAuthorityInformation $New_CA
To this "$New_CA.AuthorityType=1" this will specify the cert we upload as the Int.
I recommend putting the intermediate cert on the devices we deploy as well as trusting it for authentication in Azure AD.
The next little gotcha they don't mention is that ADFS certificate-based auth goes over a different port. It goes over port 49443, so make sure you aren't blocking that port coming into the WAPs.
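A quick way to sanity-check the port from the outside; the hostname here is just an example, substitute your own ADFS/WAP name:
Test-NetConnection -ComputerName adfs.contoso.com -Port 49443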
And last but not least, make sure that you configure ADFS to accept cert-based auth. In ADFS 2016 it's a little checkbox under the authentication methods.
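If you prefer PowerShell on the ADFS server, something along these lines should flip the same switch. This is only a sketch, so check the existing providers first so you don't drop one you still need.
# Run on the ADFS server
Get-AdfsGlobalAuthenticationPolicy | Select-Object PrimaryExtranetAuthenticationProvider
Set-AdfsGlobalAuthenticationPolicy -PrimaryExtranetAuthenticationProvider @("FormsAuthentication","CertificateAuthentication")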
Monday, December 18, 2017
Friday, December 8, 2017
Using a Public Certificate For Intune Certificates
Hello Everyone, long time no talk!
Today I want to go over an experience I had with a client setting up Certificate Based Authentication (CBA) to Exchange Online.
To give a brief rundown of how this is accomplished, I will put a couple of bullet points below:
1. Have an Internal PKI
2. Add an NDES server to your PKI
3. Configure templates
4. Configure cert profiles in Intune
A great guide on how to accomplish this can be found here
https://www.scconfigmgr.com/2016/04/12/prepare-your-environment-for-scep-certificate-enrollment-with-microsoft-intune/
The issue I ran into comes from the use of a trusted public certificate to secure the IIS server and Intune Certificate Connector instead of one from your internal PKI as in the steps Nickolaj provided in his blog.
By default the NDES server places its own DNS name in certain registry values, and it expects the certificate to carry that name. When using a public cert we need to change those values in the registry.
The keys that need to be changed can be found at the registry location shown below.
The values that hold the server name should be changed to the namespace on the public cert. See example below. Client information has been removed but you get the idea.
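If you want to script the change rather than edit it by hand, something like this does the trick. Note that the key path below is only a placeholder (use the key from the screenshot above), and both names are made-up examples.
# Sketch: swap the NDES server's internal name for the public namespace in a key's values
$keyPath      = "HKLM:\SOFTWARE\<key path from the screenshot>"   # placeholder
$internalName = "ndes01.corp.contoso.local"                       # placeholder internal FQDN
$publicName   = "ndes.contoso.com"                                # placeholder public namespace
(Get-Item $keyPath).Property | ForEach-Object {
    $current = (Get-ItemProperty -Path $keyPath -Name $_).$_
    if ($current -is [string] -and $current -like "*$internalName*") {
        Set-ItemProperty -Path $keyPath -Name $_ -Value ($current.Replace($internalName, $publicName))
    }
}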
After this change was made our SCEP certificates were getting to the devices.
Hope this helps someone out there that may be hung up on this issue.
Until next time.
Tuesday, August 1, 2017
Exchange upgrades and forgotten servers
Hello Everyone!
Got a guest post from a colleague of mine today. Erick Purkins is a Microsoft consultant out of the Houston, Texas area, and he did a write-up of a recent issue he saw. Enjoy.
--------------------------------------------------------------------------------------
I just wanted to share an experience and issue with everyone this morning.
I am currently working with a customer to upgrade their Exchange 2010 infrastructure to Exchange 2016. During our discussions, we talked about correct service pack levels and OS's required, etc. One thing I didn't think to talk about was "FAILED EXCHANGE SERVERS". Just curious if anyone brings this up?
This is important because during the installation of their first 2016 server I received a rather odd error:
"Update-RmsSharedIdentity -ServerName $RoleNetBIOSName was run: Microsoft.Exchange.Data.DataValidationException: Database is mandatory on UserMailbox."
That led me to the issue. Apparently at one time or another they had an Exchange server go belly up. Instead of fixing the issue they turned it off and forgot about it, eventually having someone go in and remove the server through ADSI.
Now normally this wouldn't have been much cause for alarm, but after reviewing the error message and a little google-fu I realized they had no arbitration mailboxes, and this was what the error referred to.
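If you want to check this for yourself up front, the arbitration mailboxes are easy to list from the Exchange Management Shell. A quick sketch:
# Run in the Exchange Management Shell
Get-Mailbox -Arbitration | Format-Table Name, Database
# A healthy org returns the SystemMailbox and FederatedEmail accounts; an empty result is the red flag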
- So how did I fix this issue?
First, I reviewed how to recreate arbitration mailboxes. Something I have done before but not in a while.
- OK seems easy, right? Wrong.
Since I had previously run the Exchange 2016 setup, it had already run /PrepareAD and updated the schema, so I could not run the Exchange 2010 SP3 setup with /PrepareAD to recreate the mailboxes.
- Where to next? I guess I'll have to use the Exchange 2016 media.
I hope you'll never have to do this, but Setup.exe has a /mode switch which you can use to remove a failed Exchange install. This is the only way; you cannot remove the install through Add/Remove Programs. The command looks like this:
Setup.exe /mode:uninstall /iacceptexchangeserverlicenseterms
Learn more at: https://technet.microsoft.com/library(EXCHG.150)/ms.exch.setupreadiness.InstallWatermark.aspx
or use setup.exe /?
After successfully removing the Exchange installation, I removed the AD objects associated with the arbitration mailboxes and re-ran Setup.exe /PrepareAD. All the correct mailboxes were recreated in the default Users container as they are supposed to be.
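For reference, the prepare step from the 2016 media also wants the license switch:
Setup.exe /PrepareAD /IAcceptExchangeServerLicenseTerms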
- Now it’s time to enable those mailboxes…
After recreating the mailboxes with the Exchange 2016 media, I followed what I would normally do and re-enabled them through the shell. Okay, new error, WTH? You mean to tell me I can only do this through the Exchange 2016 shell, but I haven't even gotten a server installed yet. Now we have a chicken-or-the-egg situation.
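Once a 2016 shell does exist, the enable step looks roughly like the sketch below. The domain path is a placeholder, and it assumes /PrepareAD dropped the accounts back into the default Users container.
# Run from the Exchange 2016 Management Shell; "contoso.com" is a placeholder
Get-User -OrganizationalUnit "contoso.com/Users" -Filter {Name -like "SystemMailbox*" -or Name -like "FederatedEmail*"} |
    Enable-Mailbox -Arbitration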
- Do we try and install Exchange 2016 again? You feeling lucky?
That was the only thing I could think of to do, and the internet was no help with that question. So I ran setup again and prayed to the Microsoft gods while crossing everything and holding every lucky charm I could find.
Must have been the lucky rabbit's foot, because this time around we were successful at installing Exchange 2016.
- Moral of the story
Talk to your customers about failed or improperly decommissioned servers. Double-check your arbitration mailboxes prior to any upgrade. It may just save you a few hours of Google-fu. Also, always keep a lucky rabbit's foot close at hand.
Wednesday, July 19, 2017
Lets Talk About Azure AD Conditional Access and Automatic Device Registration
Let's talk about Azure AD Conditional Access for a second.
It's deceiving, like rob-you-in-the-night-after-you-thought-you-were-friends deceiving. I say this for two reasons.
1. In the rules for Conditional Access there is an option labeled 'domain joined'. This is misleading. What it is really checking is whether the device is registered within Azure AD and domain joined. If the device is domain joined but not registered, then it won't honor the conditional access controls. Registration can happen automatically for domain-joined devices once some configuration is done on-prem (more on that later).
2. Conditional Access only supports applications that use modern auth. This wouldn't be so big a deal if enabling Conditional Access also disabled legacy authentication methods, but it doesn't. What this means is all your fancy layered rules can be defeated by someone in China firing up Outlook 2010 and using a compromised account. Don't believe me? Take a look here.
https://docs.microsoft.com/en-us/azure/active-directory/active-directory-conditional-access-supported-apps
Microsoft's suggested fix is to stand up ADFS and use claims rules to block legacy auth....not much of a fix in my opinion.
I want to circle back around to point number 1 and talk about how to do automatic registration of domain joined devices.
Its not my style to just rehash all the steps in another article unless I had some sort of gotcha moment during it. The steps to enabling this feature can be found here
https://docs.microsoft.com/en-us/azure/active-directory/active-directory-conditional-access-automatic-device-registration-setup
What I do want to touch on is some scenarios I thought about when doing this. First, some background info on how the registration works. Windows 10 devices have the logic to join Azure AD baked into the OS. You configure your SCP, configure ADFS if you have it, and you're off to the races.
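The SCP itself is usually created from the AAD Connect box with the AdSyncPrep module. Roughly, it looks like the sketch below; the module path is the default install location and the connector account name is a placeholder.
# Run on the AAD Connect server (module ships with AAD Connect; path may differ)
Import-Module "C:\Program Files\Microsoft Azure Active Directory Connect\AdPrep\AdSyncPrep.psm1"
$aadCreds = Get-Credential   # a global admin for the tenant
Initialize-ADSyncDomainJoinedComputerSync -AdConnectorAccount "CONTOSO\aadconnectsvc" -AzureADCredentials $aadCreds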
A little caveat that I found out is that my devices would not sync unless I was also syncing their computer account object. I believe this is due to Windows 10 machines not being tied to a user account when they sync (more on that later). I did some testing, and what I saw when I stopped syncing the computer objects was that it also removed my registered devices out of Azure AD. That is gotcha #1.
Windows 7 devices are a little different. They require a small MSI package to be run to force a registration since they do not have the baked-in logic. What I found when testing these guys is that they are tied to a user; whichever user's login triggers the join gets the device registered under their name. This means that only users that are being synced into Azure AD can register. Gotcha #2. If a non-synced user tries, it will fail silently.
What is interesting, though, is that if a synced user stops syncing or gets removed from Azure AD, the device will remain and not be associated with any user, like the Windows 10 devices.
So huge wall of text. Here's a picture of my devices and the output of dsregcmd /status on a successfully joined Windows 10 machine to make it all better.
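If you want to run that same check on one of your own machines, it's just the command below; in the output look for AzureAdJoined and DomainJoined both reporting YES on a successfully registered, domain-joined box.
# Run from a command prompt or PowerShell on the Windows 10 machine
dsregcmd /status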
See ya later!
Friday, July 7, 2017
Adding Additional OUs to AAD Connect Sync Filter
Hello again internet. Had a quick post I wanted to write up on changing or adding new OUs to your AAD Connect sync filter.
First let's start off with a little background information. After you install AAD Connect, by default it runs what is called a 'Delta' sync every 30 minutes. This sync only syncs changes made to objects since the last sync.
The inverse of this is an 'Initial' sync, which runs a full sync against all objects regardless of whether they have been changed or not. This is useful for two reasons (see the sketch after the list for how to check your current schedule).
1. Running an Initial sync is like giving AAD Connect the old 'turn it off and back on' treatment. Doing this can actually alleviate certain sync errors. I don't actually believe this is an intended use or feature of the Initial sync, but what works in the field isn't always what works on paper.
2. Running an Initial sync will pick up any changes to your object filtering, such as adding or changing what OUs you are syncing. A regular Delta will not do this.
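As mentioned above, you can see the scheduler settings (including that 30-minute interval) from the ADSync module on the AAD Connect server. This assumes a build new enough to have the built-in scheduler.
Import-Module ADSync
Get-ADSyncScheduler    # shows AllowedSyncCycleInterval, SyncCycleEnabled, NextSyncCyclePolicyType, etc.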
------------------------------------------------------------------------------
Now that we have that out of the way, let's get into how to actually change what OUs you are syncing. The easiest and best way to do this is through the 'Synchronization Service' GUI. If you ever messed with AAD Connect's predecessor, DirSync, then this will look familiar.
Once you are greeted with the console above, you want to go to the Connectors button across the top ribbon. When you arrive at this page, you want to right-click the connector with your local domain name and choose Properties.
Once inside of the properties you want to drop down to Configure Directory Partitions and then choose the Containers button.
You will then be greeted with a login prompt. Enter admin credentials with the proper permissions, which could vary depending on if you used express or custom setup. Once inside you will see a GUI with your AD layout. A simple check of the box, just like what you did during setup, can remove or add any OU.
Once you have done this you will want to run an Initial sync. You can kick these off from the GUI, but it's messy. I recommend using PowerShell. You can use the command:
Start-ADSyncSyncCycle -PolicyType Initial
Yes, that is the word 'sync' in there twice. You can also use this command with a -PolicyType Delta switch for those times you want to manually kick off a regular sync.
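For example:
Start-ADSyncSyncCycle -PolicyType Delta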
Once you run this Initial sync all the objects in your new OUs should start syncing!
If anyone stumbles upon this hope it helps you out some.
Wednesday, May 24, 2017
Intune NDES Connector
Hello Everyone!
Short post here today.
I have recently been doing a lot more Intune work and ran into a small gotcha that was not documented by Microsoft anywhere.
I am not going to dive into the details of setting up an NDES server or PKI infrastructure, god have mercy on you if you have to do this and don't know how, but what I will do is link you to some good articles.
The official document from MS - take heed of my warning comment and the one from Sassan!
https://docs.microsoft.com/en-us/intune-classic/deploy-use/Configure-certificate-infrastructure-for-scep
My preferred document
https://www.scconfigmgr.com/2016/04/12/prepare-your-environment-for-scep-certificate-enrollment-with-microsoft-intune/
Both very similar documents but the second one is easier to follow and a little more fleshed out in my opinion.
What I want to address today is this part
This is where you create the certificate that the Intune Connector is going to use. What it doesn't tell you is that this connector does not accept certs issued from a template above schema version 2.
See here
So if you are using custom templates and are on a schema version higher than 2, do not copy from that template; use the built-in template.
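If you're not sure what schema version a template is, one way to check is to read the template objects out of the Configuration partition. This is a sketch that assumes the ActiveDirectory module and default naming.
# Sketch: list certificate templates and their schema versions from AD
Import-Module ActiveDirectory
$configNC = (Get-ADRootDSE).configurationNamingContext
Get-ADObject -SearchBase "CN=Certificate Templates,CN=Public Key Services,CN=Services,$configNC" `
    -Filter 'objectClass -eq "pKICertificateTemplate"' `
    -Properties "msPKI-Template-Schema-Version" |
    Select-Object Name, "msPKI-Template-Schema-Version"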
The Intune Connector does not tell you why the install fails, only that it does.
Sometimes I just...
(╯°□°)╯︵ ┻━┻
Monday, April 10, 2017
AAD Connect Service Account Changes
How's it going everyone.
Had an interesting conversation with another engineer today about the service account that AAD Connect is using. Normally, if you do not specify a service account, it should create an account for you named AAD_ followed by a bunch of junk characters, assuming you have the proper permissions.
A lot of time was spent today trying to figure out why a good and working install of AAD Connect did not have the expected user account, and maybe this is my rookie showing, but the service account was running under NT SERVICE\ADSync.
This didn't seem right to me as I was expecting the AAD_ account, so rereading the documentation I found out that Microsoft changed the default service account AAD Connect uses in April 2017.
It appears all new versions will default to using the Virtual Service Account method.
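If you want to check which account your own install is using, the service definition will tell you:
# Shows the account the ADSync service logs on as
Get-CimInstance Win32_Service -Filter "Name='ADSync'" | Select-Object Name, StartName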
The relevant documentation can be found here
https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-accounts-permissions#create-the-ad-ds-account
Friday, March 17, 2017
Migration Endpoint Auth Failures
Hello again everyone.
Just want to talk about a little gotcha that might occur during your migrations.
Your on-premises migration endpoint requires on-premises credentials to access the environment. I recently changed my admin credentials in my lab and started running into this error when trying to move some mailboxes.
Real descriptive, Microsoft. Thanks for that; what would we do without you.
In all seriousness, you can receive a more detailed error by using PowerShell, as seen below.
You have to specify an endpoint, as you can see. If you do not know your endpoint name you can just run Get-MigrationEndpoint.
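Roughly what that looks like from Exchange Online PowerShell; the endpoint name here is just a placeholder, and Test-MigrationServerAvailability is one way to surface the underlying auth failure.
Get-MigrationEndpoint                                         # find the endpoint's identity
Test-MigrationServerAvailability -Endpoint "OnPremEndpoint"   # placeholder name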
Now that you know you have an auth problem, you can fix this by going into the Exchange Online EAC, then navigating to Recipients on the left-hand side > the Migration tab across the top > then the ellipsis (the ... button).
Once there you can double-click the endpoint, and in the next window you should see a field that says Associated Administrator. Right next to it, very sneakily placed, is the blue Update link we want.
Once we choose that, fields to update the username and password will appear, and we can enter known-good credentials.
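If you'd rather skip the EAC, the same update can be made from the shell; the endpoint name is a placeholder again.
# Prompts for the on-premises admin account and updates the stored credentials on the endpoint
Set-MigrationEndpoint "OnPremEndpoint" -Credentials (Get-Credential)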
Hope this helps someone find what they are looking for sooner!
Thanks again everyone.
Friday, February 10, 2017
AAD Connect Error, Encryption Keys Could Not Be Accessed
Hello again,
This installment I have a write-up on an error a co-worker encountered in AAD Connect. All credit for the write-up goes to an engineer by the name of Cody Rowe, reposted here with permission. Thanks Cody!
Fixing AAD Connect Service Account Issues
Issue: The server had not been restarted for a very long time. The customer needed to install updated VMware tools for their backup solution. On reboot, the AD sync service could not start and gave off this error.
That led me to a few thoughts after doing some searching:
- An update was installed after reboot. Maybe this broke the service.
- The local service account used to start the AD sync service was affected by the reboot.
After coming up empty with repairing, reinstalling, and looking over Event Viewer longer, I decided to proceed with the assumption that the local service account had its password reset. I reset the password on the local service account:
Added the service account to local administrators to be able to log into the machine with said account. Logged out and logged in as the service account. Make sure you update the password associated with the service.
Now I went through the process of abandoning the previous encryption key and adding in a new one. Run miiskmu.exe, located in the Bin folder of the AD sync directory, with elevated permissions.
When the operation is complete, stop the AD sync service, since it will automatically start and attempt to create new encryption keys. Go back to the key management utility and select Add new key to key set. I opted to select re-encrypt as well.
When that finishes, start the service and open PowerShell. Run the following:
Import-Module ADSync
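From there you can kick off a sync and check each connector's run result; something like:
# Kick off a delta sync; the per-connector result (e.g. stopped-extension-dll-exception)
# shows up in the Operations tab of the Synchronization Service Manager
Start-ADSyncSyncCycle -PolicyType Delta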
Specifically, the stopped-extension-dll-exception is what I'm concerned about. Going to Event Viewer, it appeared that I was getting password issues with the service accounts.
I reset passwords for all the service accounts (AD for both forests, cloud service account for the machine) and added the updated passwords to the connection credentials for all three connectors.
After that I ran another delta sync, and everything came back clean.
Editor's note, from another colleague:
To add to that, when you abandon the key, all of the data inside the sync database that was encrypted with the old key is invalid. This includes the passwords for the accounts used by each connector. This is why the last step was required.
Thursday, January 12, 2017
Sync Any Folder To OneDrive
I recently started using OneDrive more. Mostly because our company made the switch internally to OneDrive for business and I got that sweet, sweet 1TB of storage.
Personally, at home anyways, I am a Google Drive user. One of the great features of Drive's sync client is the ability to sync any folder of your choosing to their cloud.
By default, OneDrive only syncs what is in the OneDrive folder. The path to the OneDrive folder can be changed, but by default it is located at C:\Users\%USERNAME%\OneDrive - <Business name>.
Now there is no true way to sync custom folders, but there is a workaround, a way to trick OneDrive by using junction links. A junction link appears as a full folder inside the folder it lives in but is actually a link to another location. In the screenshot below you can see I have linked my whole "My Documents" folder to OneDrive.
Creating the junction link is a simple CMD command. Remember to run as admin. You want to run the command as follows:
mklink /j "C:\Users\(your user profile)\OneDrive - Business\Name of junction folder" "C:\source folder path"
The junction link does not have to have the same name as the source folder. In the screenshot below you can see the command I ran to place "My Documents" inside of OneDrive.
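For example, something like this; the profile and tenant names are made-up placeholders:
mklink /j "C:\Users\jdoe\OneDrive - Contoso\My Documents" "C:\Users\jdoe\Documents"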
I hope this helps you to unlock some of the potential of OneDrive. Please note that as far as I can tell this is not a supported solution to what appears to be a blatant lack of a modern feature, proceed with caution, please consult your physician before using, do not operate while impaired, additional taxes and fees may apply, etc...etc...