Synology has limited Docker availability in the package manager to a select list of models:
- 18 series: DS3018xs, DS918+, DS718+, DS218+
- 17 series: FS3017, FS2017, RS18017xs+, RS4017xs+, RS3617xs+, RS3617xs, RS3617RPxs, DS3617xs, DS1817+, DS1517+
- 16 series: RS18016xs+, RS2416+, RS2416RP+, DS916+, DS716+II, DS716+, DS216+II, DS216+
- 15 series: RS815+, RS815RP+, RC18015xs+, DS3615xs, DS2415+, DS1815+, DS1515+, DS415+
- 14 series: RS3614xs+, RS3614xs, RS3614RPxs, RS2414+, RS2414RP+, RS814+, RS814RP+
- 13 series: RS10613xs+, RS3413xs+, DS2413+, DS1813+, DS1513+, DS713+
- 12 series: RS3412xs, RS3412RPxs, RS2212+, RS2212RP+, RS812+, RS812RP+, DS3612xs, DS1812+, DS1512+, DS712+, DS412+
- 11 series: RS3411xs, RS3411RPxs, RS2211+, RS2211RP+, DS3611xs, DS2411+, DS1511+, DS411+II, DS411+
- 10 series: RS810+, RS810RP+, DS1010+, DS710+
You can get it working on other Synology NAS models by downloading the package directly from Synology’s server and performing a manual install.
I have tested this and it is working on my Synology DS416play.
Or you can check if there is a newer version available in the directory tree:
Installation is simple, since there is a “Manual Install” function in the Package Center:
- Launch Package Center
- Click “Manual Install”
- Upload the Docker .spk file you downloaded above
Application is running
Trying to run a container
Feel free to comment to let me know if this works or not on your unsupported Synology NAS.
Arman Khalili has spent much of his career building data center businesses on the West Coast. Khalili has begun a new chapter in that journey with the relaunch of Evocative, an established colocation brand in the Bay Area for many years, which was acquired from 365 Data Centers in April.
The new Evocative Inc. is already growing across California. After starting with data centers in Emeryville and San Jose, Evocative has expanded into Los Angeles, leasing 42,000 square feet of space at 1200 West 7th Street.
As industry consolidation creates huge data center companies, Khalili believes there’s lots of room for nimble players targeting local markets.
“I think there’s a phenomenal opportunity for independent players to come in and work closely with customers,” said Khalili, the CEO of Evocative. “There’s no such thing as ‘one size fits all.’”
Khalili is a veteran executive and entrepreneur who has launched or led a series of infrastructure companies. Most recently, he was the founder and CEO of data center provider CentralColo (which recently rebranded as Element Critical). Khalili was also the founder and CEO of UnitedLayer, a leading colocation provider in San Francisco, and a co-founder of Sirius, one of the early ISPs in Silicon Valley.
After CentralColo brought in new majority investors last year, Khalili began seeking new ventures. “I still like the business, and came across these assets,” he said.
Evocative was initially founded in a former Colo.com data center in Emeryville, which is located at the eastern end of the Bay Bridge, between Oakland and Berkeley.
Evocative built a solid business selling colocation and managed hosting services, with many customers using less than a full rack. In 2013, 365 Data Centers acquired Evocative and its data center.
Amid robust merger activity, 365 Data Centers decided to seek a buyer. In April, an investment group working with Khalili acquired two of the company’s California facilities: the former Evocative site in Emeryville, as well as another 365 Data Centers facility in San Jose (534 Stockton Avenue) that had previously been owned by Switch & Data and Equinix. These two facilities form the nucleus of the new Evocative, with Khalili as CEO.
The remainder of 365 Data Centers was acquired in a separate transaction by a group of investors including Chirisa Investments, Lumerity Capital and Longboat Advisors.
The two carrier-neutral facilities span a combined 40,000 square feet of colocation space, with the ability to expand to 105,000 square feet, and are positioned to handle flexible lab and high-density computing requirements. Evocative offers a suite of fully customizable data center services, including colocation, managed services, and public, private and hybrid cloud solutions.
Expanding in Los Angeles
Earlier this month, Evocative said it has acquired two downtown Los Angeles data centers from ColoNet Solutions and executed a long-term lease with Rising Realty Partners at West 7 Center (previously known as the Garland Center). These facilities add 42,000 square feet and 2.5 megawatts of capacity, with significant expansion space available if needed.
“This acquisition is in line with our expansion plans and our acquisition earlier this year,” said Khalili. “We look forward to continuing growing the company both organically and through additional acquisitions.”
“We really want to be focused on the West Coast,” said Khalili. “In Los Angeles, the amount of supply is constrained and the demand continues. You can’t just build a 100-megawatt data center in Los Angeles, as you can in other places.”
Khalili believes that many small to medium-sized data centers can provide mission-critical services for years to come.
“We’re getting good at retrofitting second-generation data centers and giving them an infrastructure upgrade to make them competitive for the next 10 years,” said Khalili.
The post Evocative Relaunches, Builds Data Center Footprint on West Coast appeared first on Evocative Data Centers.
Today, the MacArthur Foundation announced the finalists for its 100&Change competition, which will award a single organization $100 million to solve one of the world’s biggest problems. The Internet Archive’s Open Libraries project, one of eight semifinalists, did not make the cut to the final round. Today we want to congratulate the 100&Change finalists and thank the MacArthur Foundation for inspiring us to think big.
For the last 15 months, the Internet Archive team has been building the partnerships that can transform US libraries for the digital age and put millions of ebooks in the hands of more than a billion learners. We’ve collaborated with the world’s top copyright experts to clarify the legal framework for libraries to digitize and lend their collections. And we’ve learned an amazing amount from the leading organizations serving the blind and people with disabilities that impact reading.
To us, that feels like a win.
In the words of MacArthur Managing Director Cecilia Conrad:
The Internet Archive project will unlock and make accessible bodies of knowledge currently located on library shelves across the country. The proposal for curation, with the selection of books driven not by commercial interests but by intellectual and cultural significance, is exciting. Though the legal theory regarding controlled digital lending has not been tested in the courts, we found the testimony from legal experts compelling. The project has an experienced, thoughtful and passionate team capable of redefining the role of the public library in the 21st Century.
So, the Internet Archive and our partners are continuing to build upon the 100&Change momentum. We are meeting October 11-13 to refine our plans, and we invite interested stakeholders to join us at the Library Leaders Forum. If you are a philanthropist interested in leveraging technology to provide more open access to information—well, we have a project for you.
For 20 years, at the Internet Archive we have passionately pursued one goal: providing universal access to knowledge. But there is almost a century of books missing from our digital shelves, beyond the reach of so many who need them. So we cannot stop. We now have the technology, the partners and the plan to transform library hard copies into digital books and lend them as libraries always have. So all of us building Open Libraries are moving ahead.
Remember: a century ago, Andrew Carnegie funded a vast network of public libraries because he recognized democracy can only exist when citizens have equal access to diverse information. Libraries are more important than ever, welcoming all of society to use their free resources, while respecting readers’ privacy and dignity. Our goal is to build an enduring asset for libraries across this nation, ensuring that all citizens—including our most vulnerable—have equal and unfettered access to knowledge.
Thank you, MacArthur Foundation, for inspiring us to turn that idea into a well thought-out project.
–The Open Libraries Team
With the achievement of Making Our Back End Screamingly Fast, we are shifting the focus of our little team to more UI features.
To learn more about you, our user community, we are asking for a few minutes of your time to fill out a brief survey.
We opened the survey only a few days ago and already have a few dozen responses. So, we’re thinking that we’ll leave it open for a few months, then close it and write all about it.
Please fill out the survey and ask your friends and colleagues to do so as well. The link is available from every page on the Open Hub.
Yeah, but it’s been only around 87 work days. On the other hand, we keep strange hours and are working regularly on Sunday morning to perform upgrades and improvements. We’ve done a lot and would like to share some things with you.
At the end of May, we announced our FIS Ohloh Database Split (FODS) project (About the FODS Architecture). We said there was still work to do, and we got busy doing it. Here’s a quick punch list:
- Fixed an issue in SVN where a “REPLACED” file was incorrectly counted
- Did a round of performance improvements on the FODS architecture
- Post-FODS, Ohloh UI website performance was poor; we brought server response times back to pre-deployment speeds
- Aggressively hunted down expensive queries and improved them
- Maintained 99.7% test coverage on our Ohloh UI
- Updated our automated Selenium scripts to verify post-FODS functionality
- Improved language support for Ohcount: Grace, AMPL, Shell Script detection, Puppet versus Pascal disambiguation, Objective C detection
- Overhauled our Job Scheduler Logic to better identify Code Locations in need of update
- Leveraged Machine Learning to identify spam accounts
- Fixed issue that caused bloated Ruby processes in production web servers
- Deployed incremental FODS improvements to address DB contention, and back-end process deaths
- Fixed issue that was blocking AnalyzeJobs and updated over 300,000 Project Analyses in a few days
- Added 60,000 Go Code Locations for the Black Duck Knowledge Base
- Added a new Enhanced License feature to better illustrate the rules around a license type (we have more data to populate)
- Even more fixes to job logic, job execution, job death, jobs, jobs, jobs, jobs, jobs!!!!
Our goal over the past few months was to Make Our Back End Screamingly Fast. And we’ve achieved that.
About a year ago, we were coming off of the massive Back-End Background issue, and Project updates were in the double digits per hour. Like “10” and “20”. With the work we’ve done, it’s consistently been 5,000 updated analyses per hour.
We also dealt with the 200,000+ projects that had enlistments only at the long-defunct Google Code forge. We deleted them. We have a plan to search for those projects on GitHub, and if we can find them, we can re-add them. But it was really important to clear out all those projects that would never be able to be updated again. This let us clear out all the failed jobs related to those projects, too.
Up next is Microsoft’s Codeplex. In October, the site will switch to “read-only” mode. In December, the site shuts down. We’ll do the same process — delete the projects and clear out the jobs.
Right now, we’re updating 68% of our projects every 3 days. When we drop the Codeplex projects, which are mostly broken because CodePlex broke their SVN implementation ages ago (Google it; it’s too painful for me to talk about), we expect to push that number over 80%.
Oh, yeah, on May 11, 2017, after I presented at OSCON, we switched the Ohloh UI repository from private to public. So yeah, the Open Hub is OSS. While I was at OSCON, I also had a chance to sit down and chat with the indomitable Randal Schwartz for TWIT’s FLOSS Weekly.
We’ve got more wonderful things planned, so more blog posts are coming. As always, thank you for being a member of the Open Source Community and the Open Hub.
FOR IMMEDIATE RELEASE
Evocative Announces the Addition of Derek Garnier as Company President
Derek Garnier, experienced data center and internet infrastructure executive, is appointed to executive management team.
San Jose, CA, September 6, 2017 – Evocative, LLC, a leading provider of secure Internet infrastructure solutions, today announced that Derek Garnier has joined the company as President and Chief Operating Officer. Mr. Garnier will drive Evocative’s data center operations including management of the company’s full suite of services, as well as design, delivery, support, and maintenance.
“We are excited to welcome Derek to Evocative. He shares our customer-first mindset, dedication to being a trusted partner to our clients, and focus on delivering innovative solutions that help organizations meet their current technology requirements and realize their future business goals,” said Arman Khalili, Evocative CEO. “Derek and I have worked closely together for more than a decade and I am certain that his deep expertise in data center and Internet infrastructure technology will be a key component in Evocative’s future growth.”
“Evocative has a long history and unparalleled reputation for providing custom Internet infrastructure solutions that allow enterprises in Silicon Valley, the San Francisco Bay Area and now Los Angeles, to run their businesses on their terms,” said Derek Garnier, Evocative’s newly appointed President. “I am privileged to be joining such an outstanding team of professionals, and to be able to build upon Evocative’s solid foundation, helping to drive continued success.”
Prior to joining Evocative, Mr. Garnier was the CEO of Layer42, a Silicon Valley-based data center and national network provider, which was acquired by Wave Broadband. He led the organization for many years, through to the sale of the company.
Headquartered in Emeryville, CA, Evocative operates secure, high availability data centers in the heart of Silicon Valley and Los Angeles. The company’s fully customizable data center services include colocation, hybrid IT, managed private cloud, dedicated hosting, and network and security services.
Evocative is a North American company and an owner and operator of secure, compliant, highly available data centers. We are the trusted guardians of our clients’ Internet infrastructure. To tour an Evocative data center or receive additional information on data center services, please visit http://www.evocative.com.
In prior releases of Windows Server, Microsoft shipped basic malware protection through its Windows Defender software. For full protection, either System Center Endpoint Protection or a third-party antivirus solution was required. With Windows Server 2016, Windows Defender matured into a fully fledged antivirus solution. It has now been rebranded as Windows Defender Antivirus.
Regardless of whether you choose Windows Defender Antivirus or a third-party antivirus solution, you need to be sure these products are not scanning critical Exchange components. Microsoft publishes an extensive list of file, folder, and process exclusions to include in your antivirus configuration.
There are eighty-four exclusions in total.
Adding these exclusions is critical to the health and performance of Exchange. Without them, antivirus software could lock or quarantine files and processes critical to the operation of Exchange.
In this article, we explore how to add the required 84 exclusions to Windows Defender Antivirus. We also have a basic script to automate adding these exclusions for you.
Let’s get started!
Adding Exchange exclusions with PowerShell
Adding 84 exclusions manually through the graphical user interface would be time-consuming, tedious, and prone to human error. This is only magnified by the number of Exchange servers we need to deploy. Windows Defender can be managed through multiple methods (such as System Center or Group Policy), but for this article we will explore adding the required exclusions using PowerShell.
To add an exclusion via PowerShell we can use the Add-MpPreference cmdlet. For a folder exclusion, we combine this with the -ExclusionPath parameter. For example, a folder exclusion may look like this.
C:\> Add-MpPreference -ExclusionPath %SystemRoot%\Cluster
A folder exclusion not only excludes the folder and its files but also all sub-folders.
We can also substitute logical paths with environment variables. In the example above, %SystemRoot% is an environment variable that maps to your Windows folder (for example, C:\Windows).
The Exchange setup program creates an environment variable for your Exchange install path called %ExchangeInstallPath%. For example, if you installed Exchange in the default location this variable would equal C:\Program Files\Microsoft\Exchange Server\V15. We can use this environment variable once to eliminate 18 of the necessary folder exclusions required by Exchange.
We can also specify multiple folder paths by separating them with commas. For example, to quickly add all the required folder exclusions we can run this one line of code.
C:\> Add-MpPreference -ExclusionPath %SystemRoot%\Cluster, %SystemDrive%\DAGFileShareWitnesses, %ExchangeInstallPath%, "%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files", %SystemRoot%\Microsoft.NET\Framework64, %SystemRoot%\System32\Inetsrv
Note: Should a path contain spaces you will need to enclose that path in quotation marks.
Exchange guidance also requires us to exclude various processes. To exclude processes we use the -ExclusionProcess parameter. For example, to exclude all required Exchange processes run the following command. We can separate each process with a comma.
C:\> Add-MpPreference -ExclusionProcess ComplianceAuditService.exe, Dsamain.exe, EdgeTransport.exe, fms.exe, hostcontrollerservice.exe, inetinfo.exe, Microsoft.Exchange.AntispamUpdateSvc.exe, Microsoft.Exchange.ContentFilter.Wrapper.exe, Microsoft.Exchange.Diagnostics.Service.exe, Microsoft.Exchange.Directory.TopologyService.exe, Microsoft.Exchange.EdgeCredentialSvc.exe, Microsoft.Exchange.EdgeSyncSvc.exe, Microsoft.Exchange.Imap4.exe, Microsoft.Exchange.Imap4service.exe, Microsoft.Exchange.Notifications.Broker.exe, Microsoft.Exchange.Pop3.exe, Microsoft.Exchange.Pop3service.exe, Microsoft.Exchange.ProtectedServiceHost.exe, Microsoft.Exchange.RPCClientAccess.Service.exe, Microsoft.Exchange.Search.Service.exe, Microsoft.Exchange.Servicehost.exe, Microsoft.Exchange.Store.Service.exe, Microsoft.Exchange.Store.Worker.exe, Microsoft.Exchange.UM.CallRouter.exe, MSExchangeCompliance.exe, MSExchangeDagMgmt.exe, MSExchangeDelivery.exe, MSExchangeFrontendTransport.exe, MSExchangeHMHost.exe, MSExchangeHMWorker.exe, MSExchangeMailboxAssistants.exe, MSExchangeMailboxReplication.exe, MSExchangeRepl.exe, MSExchangeSubmission.exe, MSExchangeTransport.exe, MSExchangeTransportLogSearch.exe, MSExchangeThrottling.exe, Noderunner.exe, OleConverter.exe, ParserServer.exe, Powershell.exe, ScanEngineTest.exe, ScanningProcess.exe, UmService.exe, UmWorkerProcess.exe, UpdateService.exe, W3wp.exe, wsbexchange.exe
Finally, the Exchange documentation also instructs us to exclude certain file types. We can do this with the -ExclusionExtension parameter. For example, to exclude all required file types run the following command. You will notice the extension list covers the database and logs files.
C:\> Add-MpPreference -ExclusionExtension .config, .chk, .edb, .jfm, .jrs, .log, .que, .dsc, .txt, .cfg, .grxml, .lzx
Scripting it instead
In the prior section, we saved a lot of time by combining our 84 exclusions into 3 lines of PowerShell code. But we can take this even further. We could combine those three lines into a very basic PowerShell script. Rather than repeat what we have above you can download and check out that example script here.
This is a script in its most rudimentary form. It has no error checking or intelligence behind it. But it can certainly act as a good starting point. I would be curious to see how you leverage it. Drop me a comment on how you improve it.
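As a rough illustration, a minimal sketch of such a script might simply wrap the three commands from the previous section in arrays. The entries below are a subset of the full lists shown above; this is a starting point under that assumption, not a complete implementation.

```powershell
# Minimal sketch: add Exchange folder, process, and extension exclusions
# to Windows Defender Antivirus. Run from an elevated PowerShell session
# on the Exchange server. Lists abbreviated for readability.
$folders = @(
    "%SystemRoot%\Cluster",
    "%SystemDrive%\DAGFileShareWitnesses",
    "%ExchangeInstallPath%",
    "%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files",
    "%SystemRoot%\Microsoft.NET\Framework64",
    "%SystemRoot%\System32\Inetsrv"
)
$processes = @(
    "EdgeTransport.exe",
    "MSExchangeTransport.exe",
    "Noderunner.exe",
    "W3wp.exe"
    # ...remaining processes from the full list above
)
$extensions = @(".config", ".chk", ".edb", ".jfm", ".jrs", ".log",
                ".que", ".dsc", ".txt", ".cfg", ".grxml", ".lzx")

# Add-MpPreference accepts arrays for these parameters
Add-MpPreference -ExclusionPath $folders
Add-MpPreference -ExclusionProcess $processes
Add-MpPreference -ExclusionExtension $extensions
```

Because each parameter accepts an array, the whole job stays at three cmdlet calls no matter how long the lists grow.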
Checking our work
You can validate that these exceptions are in place by running Get-MpPreference. For example, to check our folder exclusions we can run the following command.
C:\> Get-MpPreference | Select -Expand ExclusionPath
%ExchangeInstallPath%
%SystemDrive%\DAGFileShareWitnesses
%SystemDrive%\inetpub\temp
%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files
%SystemRoot%\Cluster
%SystemRoot%\Microsoft.NET\Framework64
%SystemRoot%\Microsoft.NET\Framework64\v4.0.30319\Temporary ASP.NET Files
%SystemRoot%\System32\Inetsrv
Similarly, you can do this for the extension and process exclusions as well. Just switch out the property after the -Expand parameter. For example, to check all file extensions we specify the property ExclusionExtension. For processes substitute this with ExclusionProcess.
C:\> Get-MpPreference | Select -Expand ExclusionExtension
.cfg
.chk
.config
.dsc
.edb
.grxml
.jfm
.jrs
.log
.lzx
.que
.txt
Alternatively, you can also check this from the Windows Defender client itself. From Windows Defender select Settings in the top right. From the Settings screen scroll to the Exclusions section and click Add an exclusion.
From the Add an exclusion screen you can verify all exclusions we added via PowerShell.
Here are some articles I thought you might like:
- Install Exchange 2016 in your lab (7-part series)
- Renew a Certificate in Exchange 2016
- Extend, Prepare and Verify Active Directory for Exchange 2016
- Configure Kemp Load Balancer for Exchange 2016
Drop me a comment if you improve the Windows Defender script. Also, be sure to join the conversation on Twitter @SuperTekBoy.
Special Thanks: MVP Mike Crowley
The post Required Exchange exclusions for Windows Defender Antivirus appeared first on SuperTekBoy.
In case you missed it, four new UI-Router guides have been published this summer.
This guide provides information about how UI-Router handles views, including:
- Views (for a state)
- UI-Views (viewports)
- Nested views
- Multiple named views
- View targeting
This guide describes how UI-Router transitions between application states.
It explains concepts such as:
- What a Transition is
- Transition lifecycle events
- To and from states
- Entered and exited states
- Nested states
- Atomic transition behavior
Transition Hooks are a very powerful feature allowing you to hook into transition lifecycle events:
This guide explains how to:
- Perform asynchronous actions during a transition
- Redirect or abort transitions
- Choose which transitions to hook into
Lazy loading allows an application to be split into smaller chunks.
This can drastically reduce the initial bundle size, allowing apps to load and bootstrap much faster.
This guide explains:
- General purpose lazy loading capability of UI-Router
- Module lazy loading
- Future States
- Framework specific approaches
When a company has implemented Exchange hybrid and has moved some or all their users to Office 365, the question “How do I create a mailbox in Office 365?” frequently comes up.
In this article, we explore how to create a mailbox in Exchange Online when directory synchronization is in place, using Exchange 2016. We will look at how to complete this task with both the GUI and PowerShell. Note that these steps are identical for Exchange 2013.
Using the Exchange Admin Center
This is the simplest and quickest way to create a mailbox in Office 365. The drawback of this method is that it only allows you to create an entirely new Active Directory user. A preexisting user without a mailbox cannot be enabled for an Office 365 mailbox using the GUI; to grant an existing user an Office 365 mailbox you will need to use PowerShell. Alternatively, that user could be given an on-premises mailbox, which is then moved to Office 365.
If your current process is to create a new account in Active Directory first and then enable the mailbox in Exchange second, I would recommend reversing these steps. Using the method below allows you to create a basic user in Active Directory with a mailbox in Office 365. Then you can go back into Active Directory to make any additional changes to the new account, such as group memberships.
For our example, we are going to create a new user called Wilfred Mott who will have a mailbox in Office 365. Wilfred does not currently have a user account in Active Directory so we can use this method. Wilfred’s email will be firstname.lastname@example.org.
From your on-premises Exchange 2016 server, log into the Exchange Admin Center. Select the Recipients tab and Mailboxes sub-tab. Click the New (plus sign) and select Office 365 mailbox.
Note: If you do not see this option you may be missing the required RBAC permissions, or, there is an issue with your hybrid configuration.
Selecting this option walks you through the process of creating a remote mailbox in Office 365. The benefit here is that you do not need to migrate the mailbox after it is created as it already exists as an object in the cloud. Keep in mind that you will not see this mailbox in the Office 365 tenant until directory synchronization has run.
On the New Office 365 Mailbox window type the First name and Last name of the user. As you complete these fields you will notice that the Name field populates combining these values. The name field corresponds to the display name field for the user object in Active Directory. You can alter this field to be a different value than what was suggested.
Click Browse in the Organizational Unit section. This brings up the Select an Organization Unit dialog. From here you can select which Organization Unit (OU) you want the new user account to be created under.
Under User logon name specify a new username for the user. From the drop-down to the right of the @ symbol pick the domain suffix for the user. This builds a User Principal Name (UPN) for the user. Note that the domain name you pick here must be a domain you have validated in Office 365.
Under Mailbox type, pick the type of mailbox you want to create in Office 365. In our example, we are creating a mailbox for user Wilfred Mott so we will pick User Mailbox. Room and Equipment mailboxes are also available if you want to create a resource mailbox in Office 365.
Specify and confirm a password in the New Password and Confirm Password fields. You can also check the box to Require password change on next logon to force the user to create a new password.
By selecting Create an archive mailbox we can also instruct Office 365 to create an archive mailbox for the user in the cloud.
Unlike on-premises mailbox creation, we do not get an option to pick a primary or archive database for our user. In Office 365 we cannot manage databases or servers. In turn, this negates our ability to choose how those users are assigned or distributed across databases. Microsoft makes this choice for us and it is not uncommon for Microsoft to redistribute mailboxes across databases.
If you are looking to create a user and mailbox quickly the minimum fields required are those marked with an asterisk.
For our example, we specified a first name, last name (which populated name field for us), Organizational Unit, username, domain suffix, mailbox type, and password.
With your fields populated click the Save button. Your mail-enabled user will now appear under the Mailboxes tab. The Mailbox Type for Wilfred will be listed as Office 365. On-premises mailboxes are listed as type User. This is a great way to distinguish which mailboxes are on-prem and which are in the cloud.
Keep in mind the user will need to be assigned an Office 365 license before they can access their mailbox.
Note: Your user will not show in Office 365 until directory synchronization completes. How long this takes depends on how your sync cycle is configured. You can force an immediate sync with Azure AD Connect by running the following PowerShell command: Start-ADSyncSyncCycle -PolicyType Delta
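If you are unsure how your sync cycle is configured, the ADSync module on the Azure AD Connect server can show you before you force a sync. A minimal sketch, run on the Azure AD Connect server itself:

```powershell
# Check the sync schedule, then trigger an immediate delta sync
Import-Module ADSync
Get-ADSyncScheduler                        # shows the effective sync interval and next scheduled sync
Start-ADSyncSyncCycle -PolicyType Delta    # force a delta sync now
```

The default sync interval is 30 minutes, so in most environments the new mailbox will appear in the tenant within half an hour even without forcing a sync.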
Using the Exchange Management Shell
To perform this same task in EMS we use the New-RemoteMailbox cmdlet. Using the above example of Wilfred Mott let’s see what the process would have looked like in PowerShell.
From your on-premises Exchange 2016 server, open the Exchange Management Shell.
First, we will need to capture a temporary password for Wilfred in a variable. The password will be saved as a secure string. To do this enter the following command. Enter a password when prompted.
C:\> $password = Read-Host "Enter password" -AsSecureString
Enter password: *********
Next, let’s create the mailbox and pass in the $password variable.
C:\> New-RemoteMailbox -Name "Wilfred Mott" -FirstName "Wilfred" -LastName "Mott" -OnPremisesOrganizationalUnit "skaro.local/Whoniverse" -UserPrincipalName "email@example.com" -Password $password -ResetPasswordOnNextLogon:$true
In this cmdlet:
-Name specifies the content of the display name field in Active Directory.
-FirstName specifies the first name of the user.
-LastName specifies the last name of the user.
-OnPremisesOrganizationalUnit specifies where in the on-premises Active Directory the new user account should be created. The New-RemoteMailbox cmdlet renames this parameter from -OrganizationalUnit to -OnPremisesOrganizationalUnit to emphasize where the user account exists. In our example, we specified the on-premises Whoniverse OU in the Skaro.local domain. This is an optional parameter; if you omit it, the user will be created in the default Users OU.
-UserPrincipalName specifies the username in UPN form.
-Password calls the variable named $password from our previous command.
-ResetPasswordOnNextLogon specifies whether a user must change their password the next time they log in.
Create an Office 365 Mailbox for an Existing User
To enable an existing user with an Office 365 mailbox we can use the Enable-RemoteMailbox cmdlet. For example, if we had already created Wilfred in Active Directory Users and Computers we can enable him for an Office 365 mailbox using the following command.
C:\> Enable-RemoteMailbox -Identity "Wilfred Mott" -RemoteRoutingAddress "firstname.lastname@example.org"
In this command, the -RemoteRoutingAddress parameter specifies Wilfred’s unique SMTP address in Office 365. The domain you use in the routing address is assigned by Microsoft to your Office 365 tenant. The routing address is stamped to Wilfred’s TargetAddress property on his Active Directory account. This instructs our on-prem Exchange to route messages addressed to Wilfred to Office 365. We don’t need to specify a primary SMTP address as this will be generated by our on-prem Email Address Policy (EAP).
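To confirm the remote mailbox object exists and see which routing address was stamped, a quick on-premises check (using our example user) might look like this:

```powershell
# Verify the new remote mailbox from the Exchange Management Shell
Get-RemoteMailbox -Identity "Wilfred Mott" |
    Format-List Name, RemoteRoutingAddress, RemoteRecipientType
```

If the RemoteRoutingAddress looks wrong, it can be corrected with Set-RemoteMailbox before the next directory synchronization runs.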
Regardless of which command is used the user’s mailbox will not show in Office 365 until directory synchronization performs a sync. Keep in mind the user will need to be assigned an Office 365 license before they can access their mailbox.
Here are some articles I thought you might like:
- Change the notification email for directory synchronization failures
- Access is Denied when enabling Group Writeback
- Easily Connect to Exchange Online with PowerShell
- Easily Connect to Office 365 with PowerShell
What is your preferred way of creating new users in Office 365? Do you create the user on-prem first and then migrate, or use the method described above? Drop a comment below or join the conversation @SuperTekBoy.