
Using Power Platform Dataflows to extract and process data from Business Central – Post 3


This is the third post in a blog series about how you can use Power Platform Dataflows to get data out of Business Central in a form where it can be accessed by other applications.

The first post covered a basic integration between Power BI and Business Central.

The second post introduced Power Platform Dataflows, and data was stored in a hidden Azure storage account (also known as Azure Data Lake Storage Gen 2). Illustrated like this:

Starting point

In this post, we reconfigure our Power Platform Dataflows to store the data in our Azure storage account. Illustrated like this:

Desired end result

Once the data is in our storage account, we can access it directly and use it for a variety of purposes.

Much of the material in this blog post is based on the documentation for Power Platform Dataflows, in particular this page: https://docs.microsoft.com/en-us/power-bi/service-dataflows-connect-azure-data-lake-storage-gen2.

 

Prerequisites

We need a few things before we can accomplish the goals:

  • We need an Azure subscription, so that we can create the storage account.
  • The Azure subscription and the Power BI subscription should be linked to the same Azure Active Directory (AAD) tenant. Power BI needs to authenticate to our Azure storage account, and it uses AAD for this authentication (OAuth). Put simply, we need to log into https://powerbi.com with the same account (e.g. susan@contoso.com) that we log into https://portal.azure.com with.
  • We need to be an administrator of the Power BI account, so that we can reconfigure the dataflows.

 

Create an Azure storage account (Azure Data Lake Storage Gen 2)

The first step is to create an Azure storage account.

A word about terminology: we will create an Azure storage account, but a special type of storage account called Azure Data Lake Storage Gen 2. In the documentation for Dataflows, you will often see the latter term. Sometimes it is abbreviated to ADLSg2.

The storage account needs to be in the same Azure region as your Power BI tenant. To determine where your Power BI tenant is located, go to Power BI online and open About Power BI via the ? menu at the top. It will show a picture like this, including the Azure region:

Power BI region

To create the storage account, follow these steps:

  1. Go to the Azure portal at https://portal.azure.com and click Storage accounts in the navigation pane.
  2. Click Add.
  3. Create a new resource group called CdmStorage.
  4. Specify the name of the storage account as cdmstorage1 (or similar; the name must be globally unique).
  5. Specify the location of your Power BI tenant.
  6. Leave the remaining settings at their defaults.
  7. Click Next: Advanced at the bottom.
  8. Enable Hierarchical namespace under Data Lake Storage Gen2.
  9. Click Review + Create and click Create.

Now you have the storage account.
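If you prefer scripting to clicking through the portal, the same account can be created with the Az PowerShell module. This is a minimal sketch using the example names from the steps above; the region must match your Power BI tenant (West Europe is just a placeholder here), and the SKU is an assumption, not a requirement:

Connect-AzAccount

# Create the resource group and the storage account.
# -EnableHierarchicalNamespace turns the account into ADLS Gen 2.
New-AzResourceGroup -Name "CdmStorage" -Location "westeurope"
New-AzStorageAccount `
    -ResourceGroupName "CdmStorage" `
    -Name "cdmstorage1" `
    -Location "westeurope" `
    -SkuName "Standard_LRS" `
    -Kind "StorageV2" `
    -EnableHierarchicalNamespace $true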

 

You also need a so-called file system inside your storage account. You may be familiar with blob containers. A file system is similar to a blob container, but it's also different because it's truly hierarchical, like the file systems we are used to from PCs.

First, download and install Azure Storage Explorer from https://azure.microsoft.com/features/storage-explorer. Launch Azure Storage Explorer and log in by clicking Add Account in the left-hand ribbon. Click Next. Sign in with your usual account. Now you should be able to see your newly created storage account.

The next step is to create a file system, which must be called powerbi. Azure Storage Explorer still calls it a blob container in the UI, so that's what you should look for.

You should now have this:

ADLSg2 file system: powerbi
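If you prefer to script this step instead of using Azure Storage Explorer, a minimal Az PowerShell sketch (reusing the example account names from above) looks like this. Because an ADLS Gen 2 file system is exposed through the blob container API, the container cmdlet does the job:

# Create the mandatory "powerbi" file system in the storage account.
$ctx = (Get-AzStorageAccount -ResourceGroupName "CdmStorage" -Name "cdmstorage1").Context
New-AzStorageContainer -Name "powerbi" -Context $ctx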

 

Give Power BI permission to read/write to the storage account

We want Power BI (really, our dataflows) to be able to read and write to our storage account, so we need to give Power BI the right permissions.

This is done in two steps.

 

Step 1 is to assign the Reader role to Power BI for your storage account. In the Azure portal, select the storage account and click Access control (IAM). Then click Add role assignment:

Role assignments

In the dialog, choose the Reader role, and then search for Power BI Service:

Add role assignment

Select the search result and click Save.
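This role assignment can also be scripted. Here is a hedged sketch with the Az PowerShell module, assuming the service principal in your tenant carries the display name "Power BI Service" as shown in the search above:

# Assign the Reader role to the Power BI Service principal on the storage account.
$account = Get-AzStorageAccount -ResourceGroupName "CdmStorage" -Name "cdmstorage1"
$powerBi = Get-AzADServicePrincipal -DisplayName "Power BI Service"
New-AzRoleAssignment `
    -ObjectId $powerBi.Id `
    -RoleDefinitionName "Reader" `
    -Scope $account.Id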

 

Step 2 is to assign read/write permissions to the powerbi file system that we created earlier.

Switch to Azure Storage Explorer again. Right-click the powerbi file system and click Manage Access. You should see a page like this:

Manage access

You need to add some rows to this list, but which? To answer that, we need to look inside Azure Active Directory.

Switch back to the Azure portal and select Azure Active Directory in the navigation pane. Click Enterprise applications, select Application Type = All Applications, and then click Apply. Now enter Power in the search field and you should see something like this:

Find object IDs

Notice the Object IDs; those are what we need. But beware: yours will be different from the ones in the screenshot above.
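You can also read the object IDs from PowerShell instead of the portal. A minimal sketch, assuming the display names match the three applications shown below (verify them in your own tenant, since the object IDs are tenant-specific):

# List the object IDs of the three applications that need access to the file system.
"Power BI Premium", "Power BI Service", "Power Query Online" |
    ForEach-Object { Get-AzADServicePrincipal -DisplayName $_ } |
    Select-Object DisplayName, Id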

Now switch back to Azure Storage Explorer and add the three applications, as shown:

Grant Power BI Premium access

And

Grant Power BI Service access

And

Grant Power Query Online access

Click Save.

 

Now we have created the storage account and configured security for it. It was a lot of steps, but fortunately it only has to be done once.

 

Configure Power BI to save dataflows to the new storage account

The next step is to configure Power BI to use the new storage account. Go to Power BI online, click the settings cog in the upper-right corner, and select Admin portal:

Open Power BI Admin Portal

Under Dataflow settings, you should see this screen:

Power BI Admin Portal

The text explains that currently your dataflow data is stored in a storage account that Power BI provides, and which you can't access directly. This is what we used in blog post 2 in this series.

Obviously, we need to connect Power BI to our storage account. So click that yellow button and fill out the fields with your values. These are my values:

Connect to ADLSg2

Click Continue. Power BI now verifies that it can access the storage account. If all goes well, you should see this page:

Allow workspaces to use ADLSg2

Flip the toggle to allow workspace admins to use the storage account:

Allow workspaces to use ADLSg2

That's it for configuration of Power BI and Azure!

 

Saving dataflows to our own storage account

The workspace that we created in blog post 2 still saves data in Power BI's internal storage account. Let's switch it to save data in our own storage account.

Open the workspace settings:

Open workspace settings

And toggle this setting:

Enable ADLSg2 on workspace

Click Save.

Existing dataflows continue to save to wherever they were saving when they were initially created, so we need to recreate the dataflow from blog post 2. Once you have done that, try to refresh the dataflow a couple of times.

Now switch to Azure Storage Explorer and refresh the powerbi file system. You should see that the data has been saved there!

Files in ADLSg2

Inside the Items.csv.snapshots folder, you will find the actual data in CSV format, one file for each time we refreshed:


Open one of the files in Notepad, and you will see the actual data:

CSV data
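If you would rather take a quick look from PowerShell than from Notepad (the next post goes deeper into programmatic access), here is a minimal sketch using the Az.Storage cmdlets and the example account names from earlier; the exact folder paths depend on your workspace and dataflow names, so browse the listing first:

# List everything the dataflow has written to the powerbi file system.
$ctx = (Get-AzStorageAccount -ResourceGroupName "CdmStorage" -Name "cdmstorage1").Context
Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem "powerbi" -Recurse |
    Select-Object Path, Length

# Download a single snapshot CSV (replace the blob path with one from the listing above).
Get-AzStorageBlobContent -Context $ctx -Container "powerbi" `
    -Blob "<path copied from the listing above>" `
    -Destination "C:\Temp\Items.csv"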

 

The described file structure, with a model.json at the root and subfolders for different entities containing the actual data snapshots, is based on the Common Data Model (CDM) format. CDM is a format that is used by more and more services from Microsoft and other companies, and it enables reuse of data and tools across different domains. For example, suppose that you want to analyze the item data in Azure Databricks; this is straightforward because Azure Databricks can read CDM data. You can read more about CDM here: https://docs.microsoft.com/common-data-model/data-lake.

 

That's it for this blog post! We now have the data in a location that can be accessed from other tools!

In the next blog post, we will look at how you can access the CSV files programmatically.

 

If you have questions, or if you have ideas for future blog posts in this area, I’m happy to discuss further. Feel free to write in the comments section or privately via email (chrishd@microsoft.com).



Lifecycle Services – September 2019 (Release 2) release notes


The Microsoft Dynamics Lifecycle Services (LCS) team is happy to announce the immediate availability of the release notes for LCS (September 2019, release 2).

Database backup assets limited to Project Owners and Environment Managers

Based on feedback from our customers (that's you), we have restricted access to the Database backups section of the project asset library. This section can now only be accessed by Project Owners and Environment Managers. This gives organizations tighter control over who can upload and download full database backups from their LCS project.

To change a user’s role in your project, simply visit the Project Users page. After changing the role assigned to the user, that user must sign out. Updates to a role will take effect when that user signs in again.

 

Database refresh requiring Platform update 20 in December 2019

As of this writing, the current minimum platform version supported by Database refresh is Platform update 12. We will be increasing this to Platform update 20 in our December 2019 release of Lifecycle Services. This means that both the source and the target environment in a refresh operation will need to be using Platform update 20 or later.

New platforms are released each month, with the current general availability version being Platform update 29. Customers are encouraged to stay on a recent platform update to continue to use valuable features such as Database refresh. Platform update 20 is available in the Shared Asset Library under the Software Deployable Packages section.


Customization INSIDE the System Application in Dynamics 365 Business Central

In Business Central 2019 release wave 2 we're introducing a shift in the story around customization. In fact, it's a big step toward a future where the term "extend" has replaced "customize".
To make Business Central lighter and easier to maintain and upgrade, we're componentizing its platform and business logic in the System Application. If you're interested in what that's all about, see the following post.
In a previous post we looked at how to build new functionality on top of the System Application. But what if you find something that you want to change inside a module in the System Application? Let's see how you can do that and then build your own version of the System Application.
Note: Because the System Application is part of Business Central 2019 release wave 2, it isn't yet publicly available. To get access you'll need to join the ReadyToGo program so that you can use Microsoft Collaborate and our insider builds.
This example will walk you through the following steps:
  • Get the latest Docker image.
  • Prepare the environment for code customization.
  • Publish and install a new version of System Application.
  • Share your great work with others.

Get the latest Docker image

Start by pulling the latest Docker image (this walkthrough is based on build 15.0.34197). To do that, run the following command.

docker pull bcprivate.azurecr.io/bcsandbox-master:base-ltsc2019
After that, we need to create a Docker container. We can use our favorite PowerShell script to do that; we just need to be sure to add the useCleanDatabase parameter. There are several ways to do this, and if you're interested you can find more information in this post. For the sake of this example, however, here's how I do it:
$credential = New-Object System.Management.Automation.PSCredential -argumentList "admin", (ConvertTo-SecureString -String "P@ssword1" -AsPlainText -Force)
$imageName = "bcprivate.azurecr.io/bcsandbox-master:base-ltsc2019"
$licensePath = "C:\..\l.flf" # put the actual path to your license
$containerName = "BC"

New-BCContainer -accept_eula `
    -updateHosts `
    -containerName $containerName `
    -auth NavUserPassword -Credential $credential `
    -imageName $imageName `
    -licenseFile $licensePath `
    -doNotExportObjectsToText `
    -includeAL `
    -useCleanDatabase `
    -memoryLimit 16g
The container will start as a process and the output of the function will display in the PowerShell output. Among other parameters we can find the URL for the web client, which we'll open in a later step. Now we need to replace the standard System Application from the Docker image with our own version.
Run the following cmdlet to uninstall and unpublish the System Application:
UnPublish-NavContainerApp -containerName $containerName `
    -appName "System Application" `
    -unInstall `
    -doNotSaveData

Prepare the environment for code customization

We now have a blank environment, but we can't use it yet because it's missing a few application objects (for example, the default Role Center) that are required.
Let's open VS Code and start enhancing an existing module or building our own version of the System Application.
Note: We need the latest version of the AL extension for VS Code. The PowerShell output contains the link to the .vsix file, so we can download it from the container.
1. In VS Code, run the AL:Go! command to create a new AL Project, and then choose 4.0 as the Target Platform.
Note: The project folder should be in a location that is shared with the container. For example, a folder in C:\ProgramData\NavContainerHelper will work.
2. Update the Server and Server Instance parameters in the launch.json file with values from the PowerShell output.
3. Delete the HelloWorld.al and app.json files.
Now we're ready to code. Rather than building the System Application from scratch, we can get the latest code from the ALAppExtensions repository on GitHub. To do that, we'll follow these steps (or script them, as sketched below):
  1. In GitHub, choose the Clone or Download button, and then Download ZIP.
  2. Open the downloaded archive and copy the content of the \ALAppExtensions-master\Modules\System folder to our AL project.
Now we have the latest version of the System Application and can download symbols and make enhancements.
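If you'd rather script those two steps, here is a minimal PowerShell sketch. The archive URL assumes GitHub's standard ZIP download pattern for the master branch, and the destination is the example project folder used later in this post; adjust both to your setup:

# Download the repository archive, extract it, and copy the System modules into the AL project.
$zip = "C:\ProgramData\NavContainerHelper\ALAppExtensions.zip"
$projectFolder = "C:\ProgramData\NavContainerHelper\AL\DemoSolution"
Invoke-WebRequest -Uri "https://github.com/microsoft/ALAppExtensions/archive/master.zip" -OutFile $zip
Expand-Archive -Path $zip -DestinationPath "C:\ProgramData\NavContainerHelper"
Copy-Item -Path "C:\ProgramData\NavContainerHelper\ALAppExtensions-master\Modules\System\*" `
    -Destination $projectFolder -Recurse -Force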

Publish and install your System Application

When we're done, we'll package the System Application without publishing it.
To publish and install a new version of the System Application, we'll run the following cmdlet in PowerShell:
Publish-NavContainerApp -containerName $containerName `
    -appFile "C:\ProgramData\NavContainerHelper\AL\DemoSolution\Microsoft_System Application_15.0.0.0.app" `
    -skipVerification `
    -sync `
    -syncMode ForceSync `
    -install
Now let's open a web browser and check out how our enhanced System Application works.
We'll repeat the UnPublish-NavContainerApp, modify AL, and Publish-NavContainerApp steps until we're happy with the results.

Share your improvement

When we're done, we may want to share our enhancements with Microsoft and others. For information about how to do that, see this blog post. Note, however, that there is one difference: we will use a container rather than a cloud sandbox. Otherwise, the steps are the same.

The System Application is a work in progress, and new modules will be added in the future. If you want to peek at the latest, you can always go to the GitHub repository https://github.com/Microsoft/alappextensions. While you're there, if you see something you think we've missed, you can submit a pull request and we might add it. Additionally, if you think we've left out a module, you can submit an idea on https://aka.ms/bcideas.


Turn prospects into engaged customers with intelligent sales and marketing


The selling landscape is undergoing fundamental changes, many of them driven by the effects of B2B customers' experiences as everyday consumers. Many retailers have created personalized, nearly immersive, online experiences for each customer. Consumers shopping for goods and services continually experience fresh and delightful interactions, from highly customized offers and recommendations to frictionless channels to 24/7 interactions.

The impact of B2C on B2B

Today's B2B buyers have high expectations, because they are accustomed to sophisticated consumer interactions in their personal lives, and those expectations often go unmet. Executive B2B buyers are not impressed by marketing driven by large, relatively impersonal data analysis that leads to inconsistent and conflicting interactions, or by sales outreach that doesn't cater specifically to their needs at the right time.

The source of the problem may be largely invisible to the companies perpetuating this issue. Many organizations believe themselves to be customer-centric, while their buyers may not agree. That's a significant disconnect. Clearly, B2B has much to learn from B2C companies.

Customer experience: the rewards for getting it right

Many B2C organizations have strategically embraced modern technologies like customer data platforms (CDP) and artificial intelligence (AI) to gain a 360-degree view of their customers and follow through on those insights to optimize customer engagement.

The rewards for getting this engagement right are substantial. Many buyers are willing to pay more for a better customer experience. In terms of the potential benefits a great experience can have on sales success, a McKinsey study reported that organizations can expect:

  • 10-15 percent lower customer churn
  • 20-40 percent increase in the win rate of offers
  • Up to 50 percent lower service costs

Take a new approach

B2B companies must move away from their legacy approaches based on large, relatively impersonal data analysis and move to solutions that unify relationship data across the full customer lifecycle. That way, they can gain insights that help build credibility and trust with buyers. They can run multi-channel campaigns to increase sales-ready leads, create personal experiences, and use guided processes and AI to anticipate and respond faster to customer needs. They can build the ongoing, high-quality relationships that are necessary for long-term success.

Four principal goals

Turning prospects into engaged customers is a process. To achieve it, organizations must focus on four key priorities:

  • Nurture more demand
  • Personalize buyer experiences
  • Build relationships at scale
  • Make insight-driven decisions

Each of these drives results by using deep reservoirs of data to make technology feel more human.

Nurture more demand

Relying only on conventional, basic email marketing as the primary source of leads is simply not effective enough. In fact, the more focused and demanding the customer universe is, the more essential it is to gain deep insights into what those customers expect. Northrop & Johnson, a leading global yacht brokerage, competes for multi-million dollar customers using technology its industry has been slow to adopt. Using Dynamics 365 for Marketing has created a decided competitive advantage: Vital insights into their customer base have helped to drive a 70 percent increase in charter sales.

In any industry, companies need to generate leads across multiple channels, nurture large numbers of leads while prioritizing each one, and use data-driven insights to deliver leads that are sales-ready. Nurturing more demand is critical to growth.

Dynamics 365 for Marketing helps generate, nurture and prioritize sales-ready leads.

Personalize buyer experiences

It's time to end friction, inconsistencies, and the "do you know who I am?" part of the customer experience. Companies can acquire a holistic view of buyers, predict buyer intent, and orchestrate a connected, personalized journey for customers.

In an era where guests have more choices than ever for leisure and entertainment, Tivoli delights its guests by using Dynamics 365 Customer Insights to stay one step ahead of expectations and transform the guest experience. With its deeper understanding of guests, it can add new chapters to its long tradition of imagination and innovation.

Dynamics 365 for Marketing enables you to personalize buyer experiences and predict buyer intent.

Build relationships at scale

Mutually beneficial relationships dont simply happen with more data. Companies need to build credibility to establish and grow relationships with customers.

As HP approached its 80th anniversary, the global technology innovator decided to embrace new capabilities that would help it create the sales workforce of the future. HP already has one of the world's largest implementations of Microsoft Dynamics 365 for Sales. For the next step in its journey, HP decided to use Microsoft Relationship Sales, which combines Dynamics 365 for Sales with LinkedIn Sales Navigator.

Within just four months of rolling out the training program, HP noticed it was generating significantly more leads through relationship selling.

Together, Dynamics 365 and LinkedIn enable the company to have increased information about, and impact on, the sales relationships that are added to its sales pipeline, even as that pipeline experiences exponential growth month over month.

Dynamics 365 for Marketing helps you build relationships at scale.

Make insight-driven decisions

Here's where sales and marketing can truly align: utilizing data to uncover insights that lead to better-informed decisions throughout the sales process. This can improve performance, empower employees, and enable the company to gain increasingly effective strategic insights.

With more than 1,500 pubs serving guests throughout the UK, Marston's launched a business transition by bringing together guest data that was scattered across multiple systems into Dynamics 365. With its locations' guest data now unified, Marston's will gain a complete view of guests, which can be harnessed to generate customer satisfaction and strategic insights. This approach helps drive improved performance throughout the company, including the opportunity to empower employees, an often-overlooked aspect of a company's success.

Dynamics 365 for Marketing enables you to make insight-driven decisions to improve performance, empower employees and gain strategic insights.

Aligning sales and marketing: The intelligent way to succeed

It's possible to create exceptional experiences, drive more qualified leads, and increase revenue if an organization has the vision, process, and technology to harness all the data available. This requires high-level technology with well-defined business goals and sales and marketing applications fueled by keen intelligence. We have a compelling offering to accomplish just that with Microsoft Dynamics 365.

You are invited to learn more about what intelligent sales and marketing can do to bring your sales and marketing approach to today's level of competition. Please contact us to speak with a Microsoft expert or Microsoft partner near you.


Advanced AI topic clustering in Customer Service Insights


As artificial intelligence (AI) has evolved over the past few years, it has shown its power to transform businesses to be more productive and to serve customers better.

Using the power of AI, systems can digest large amounts of data, predict future patterns, and identify the best solutions quickly with minimum human effort. On the other hand, system capabilities that rely on AI are usually hard to implement, as they require people who have expertise in both AI and the business domain.

That's why we bring you Dynamics 365 Customer Service Insights. It focuses on your customer service scenarios and makes it very easy for you to use AI alongside traditional business intelligence. This means you can make better data-driven decisions without requiring extensive AI knowledge.

Using an out-of-the-box natural language understanding model, it generates insights on top of your support cases by automatically grouping similar cases into topics. In addition to high-quality AI models, it also comes with several advanced capabilities that work together to provide you with a market-leading solution for actionable insights.

  • AI-discovered topics with automatic optimization make it more usable in real-world scenarios
    Thanks to its semantics-embedding model, Customer Service Insights does a much better job of grouping similar cases than plain text matching. It understands the text semantics when evaluating case similarity, so that cases using different terminology to describe the same thing can be grouped together. This capability helps reduce duplicated topics. For example, cases that describe issues with the terms "promotional certificate" and "coupon code" are considered to describe the same type of issue.
    Figure 1: Customer Service Insights detects similar text semantics from case titles to reduce topic duplication

    In addition, each topic is assigned an auto-generated name that represents its relevant cases. Once a topic name is assigned, either automatically by AI or by users who rename topics later, Customer Service Insights tracks topics over time to keep their names stable and maintainable.

  • Our AI model continuously improves from customer gestures
    Customer Service Insights comes with a well-tuned AI model. However, we also understand that one model can't fit all types of businesses and needs. That's why we added features that allow you to manage and update your topics based on your needs. Furthermore, we enabled our AI model with a continuous improvement capability that can learn and improve itself from gestures, such as users giving thumbs-up or thumbs-down feedback, renaming topics, or moving cases to another topic. With continuous use, you will end up with an AI model that works better in your specific business scenario.

    Figure 2: One way to improve the AI model is to provide thumbs-up or thumbs-down feedback to indicate whether cases belong to a topic

  • We provide simple options for you to control AI results
    There are some other scenarios where you want to get better AI results but it's hard to automate them. We've been trying to make those scenarios easier for you by providing simple configurable options and settings. Changes to those settings guide the AI algorithms to generate results that work better for your business needs, but again, without requiring you to know the complexity of those algorithms. For example, data cleaning is known to be critical – but complicated – to implement in the AI world, as you can get very poor results when your data contains a lot of irrelevant information. To solve that problem, Customer Service Insights provides data-cleaning settings, where you can get your data cleaned with a couple of clicks.
    Figure 3: Clean your case data with a couple of clicks

    In addition, you may want to group topics for your customer service system in a more general or more specific way. You can control the scope of how your topics are generated by setting the topic granularity in Customer Service Insights.

    Figure 4: Set the scope of topics based on your business needs

While we make all capabilities easy to use in the product, we often get questions from customers like you who are interested in learning more details about these AI capabilities, or wonder how it works better than other solutions.

Thus, I'm glad to introduce you to a series of AI inside-out blog posts that we are planning for the next few months!

You'll see monthly posts from our AI experts that cover key areas such as what clustering is and why it's better for topic discovery than other technologies, what semantics embedding is and why it plays a critical role in Customer Service Insights, how we enabled user-friendly AI controls to address real-world clustering scenarios, and how our AI models continuously improve from users' gestures, among others.

We will update this blog post with the links to the other posts in this series once they are published. Please stay tuned! If there's any other area you are interested in and would like to learn more about, please feel free to leave your comments below.

As always, your feedback is critical for us to prioritize what's next within the product. If you have any suggestions or ideas, please don't hesitate to submit an idea or vote on others' ideas.

If you have questions about Customer Service Insights, we're always available at the Customer Service Insights forum to help you.

Enjoy!

To learn more, visit:


Assisted setup module in Dynamics 365 Business Central


INTRODUCTION

This module contains all pages that are used by assisted setup guides in Business Central. Assisted setup guides provide step-by-step guidance that helps simplify the process of setting up complex features.

WHAT HAS BEEN DONE

We have combined the assisted setup capabilities that already existed in Business Central into this module. If your extension provides setup assistance through a guide, you can add that guide to Assisted Setup for easy discoverability.

WHAT THE MODULE PROVIDES

The Assisted Setup module provides capabilities for:

  • Adding an assisted setup guide for a given extension and page ID, with an optional video link that explains the feature and a help link where the user can read more about it.
  • Adding a translation for the name of the setup record. This is helpful when the extension is available in multiple languages.
  • Checking whether a user has already completed the steps in an assisted setup guide.
  • Completing an assisted setup guide, typically from the guide itself when the user clicks Finish.
  • Running an assisted setup guide page that takes the user through the various steps to set up an extension.

You can subscribe to the following events in the module:

  • The OnRegister event notifies the extensions that they can add setups.
  • The OnReRunOfCompletedSetup event notifies that someone is running a guide that has already been completed.
  • The OnAfterRun event lets subscribers react when a user completes the steps in a guide.

USAGE EXAMPLE

The Base Application adds quite a few assisted setup guides by subscribing to the OnRegister event. In the following example, the Data Migration Wizard is added to the Assisted Setup through the API exposed by the module. See the details for codeunit 1814 Assisted Setup Subscribers, which shows that the Data Migration Wizard is added along with a video and a link to more information. Also, the English (United States) translation for the name is added.

CurrentGlobalLanguage := GLOBALLANGUAGE;

// Getting Started
AssistedSetup.Add(GetAppId(), PAGE::"Data Migration Wizard", DataMigrationTxt, AssistedSetupGroup::GettingStarted, VideoImportbusinessdataTxt, HelpImportbusinessdataTxt);

GLOBALLANGUAGE(1033);
AssistedSetup.AddTranslation(GetAppId(), PAGE::"Data Migration Wizard", 1033, DataMigrationTxt);
GLOBALLANGUAGE(CurrentGlobalLanguage);
Go to the Assisted Setup module on GitHub by clicking here.


Translation Module in Dynamics 365 Business Central


INTRODUCTION

This module lets you add and modify language translations for user data, so that people in different regions can understand the data. For example, this is useful for descriptions of items that you sell, or for providing standard operating procedures in factories located in different regions.

WHAT HAS BEEN DONE

Developers can identify the fields for which to enable translations, and then add a calculated field on the page to show the translations.

WHAT THE MODULE PROVIDES

The Translation module provides capabilities for:

  • Setting translations for a specified field on a record, for a given language.
  • Fetching and showing the translations for a field on a record.
  • Deleting all translations for a record, or for a specified field on it.
  • Showing the Translations page for a specified field on all records in a table.
  • Checking whether any translations are available.

The Translation page shows the Target Language field, which contains the target language, and the Value field, which is the translation. Note that translations can only be added for a record that is persisted in the database, and not for temporary records.

USAGE EXAMPLE

Page 1801 Assisted Setup in the Assisted Setup module shows the translations for each record using a page field, TranslatedName. The code examples below show how to make this new field look up other translations, how to populate the field from the Translation module the first time the page is opened, and how to set the translation for a given field from code.

field(TranslatedName; TranslatedName)
{
    Caption = 'Translated Name';
    ApplicationArea = All;
    ToolTip = 'Specifies the name translated locally.';

    trigger OnDrillDown()
    var
        Translation: Codeunit Translation;
    begin
        Translation.Show(Rec, FieldNo(Name));
    end;
}

 

The value is populated in the OnAfterGetRecord trigger:
trigger OnAfterGetRecord()
var
    Translation: Codeunit Translation;
begin
    HelpAvailable := '';
    VideoAvailable := '';
    if "Help Url" <> '' then
        HelpAvailable := HelpLinkTxt;
    if "Video Url" <> '' then
        VideoAvailable := VideoLinkTxt;
    TranslatedName := Translation.Get(Rec, FieldNo(Name));
end;

 

The translations are added to each record by calling the appropriate API in the Assisted Setup module, which in turn calls the following in codeunit 1813 Assisted Setup Impl.

procedure AddSetupAssistantTranslation(ExtensionId: Guid; PageID: Integer; LanguageID: Integer; TranslatedName: Text)
var
    AssistedSetup: Record "Assisted Setup";
    Translation: Codeunit Translation;
begin
    if not AssistedSetup.Get(PageID) then
        exit;
    if LanguageID <> GlobalLanguage() then
        Translation.Set(AssistedSetup, AssistedSetup.FieldNo(Name), LanguageID, TranslatedName);
end;

Go to the Translation module on GitHub by clicking here.


Microsoft Dynamics 365 Banking Accelerator adds BIAN API implementation


Following the general availability release of the Microsoft Banking Accelerator in July, today we are releasing additional API sample implementations to provide interoperability with Banking Industry Architecture Network (BIAN) API service domains for consumer loans and collateral. BIAN is a not-for-profit association to establish and promote a common architectural framework to enable banking interoperability. Microsoft and BIAN have been working together to help unlock new open banking opportunities by allowing organizations to more seamlessly and consistently share banking-specific data across disparate systems.

We're excited to partner with the BIAN community to release the API implementation examples and accelerator enhancements, which are available today here on GitHub, with setup documentation located here on GitHub.

BIAN 1

The new BIAN API sample implementation includes a Visual Studio solution, guided documentation for creating a hosted API, and a set of API controllers containing the methods associated with the BIAN API endpoints to connect to the Microsoft Common Data Service Web API. With this sample, other apps that conform to BIAN can perform operations with Dynamics 365 with the Banking Accelerator installed.


For example, using the BIAN API sample implementation, a consumer loan process can be started within Dynamics 365 and send operations back and forth over the course of fulfilling the consumer loan. The sample also includes API controllers for collateral to support consumer loans that may be secured by collateral. We encourage developers to try out the sample on GitHub, and we'll be adding more service domains and working with the BIAN community to publish these definitions on both GitHub and the BIAN API Exchange.


 

In addition to the BIAN API samples, we are continually enhancing the Banking Accelerator, and today we released an update (v1.0.3.1) on AppSource bringing additional use cases for Retail Banking.



Using Power Platform Dataflows to extract and process data from Business Central – The Movie

Boost your ecommerce revenue with Dynamics 365 Fraud Protection


With the booming growth of online technologies and marketplaces comes the burgeoning rise of a variety of cybersecurity challenges for businesses that conduct any aspect of their operations through online software and the Internet. Fraud is one of the most pervasive trends of the modern online marketplace, and continues to be a consistent, invasive issue for all businesses.

As the rate of payment fraud continues to rise, especially in retail ecommerce where the liability lies with the merchant, so does the amount companies spend each year to combat and secure themselves against it. Fraud and wrongful rejections already significantly impact merchants' bottom line, in a booming economy as well as when the economy is soft.

The impact of outdated fraud detection tools and false alarms

Customers, merchants, and banking institutions have been impacted for years by suboptimal experiences, increased operational expenses, wrongful rejections, and reduced revenue. To combat these negative business impacts, companies have been implementing layered solutions. For example, merchant risk managers are bogged down with manual reviews and analysis of their own local 30/60/90-day historical data. These narrow, outdated views of data provide a partial hindsight view of fraud trends, leaving risk managers with no real-time information to work with when creating new rules to hopefully minimize fraud loss.

One of the most common ways that fraud impacts everyday consumers and businesses is through wrongful rejections. For example, when a merchant maintains an outdated and/or strict set of transaction rules and algorithms, a customer who initiates a retail ecommerce transaction through a credit card might experience a wrongful rejection, known to consumers as a declined transaction, because of these outdated rules. Similarly, wrongful declined transactions can also happen when the card-issuing bank refuses to authorize the purchase due to suspicion of fraud. The implications of these suboptimal experiences for all parties involved (customers, merchants, and banks) directly correlate with loss of credibility, security, and business revenue.

Introducing Microsoft Dynamics 365 Fraud Protection

As one of the biggest technology organizations in the world, Microsoft saw an opportunity to provide software as a service that effectively and visibly helps reduce the rate and pervasiveness of fraud while simultaneously helping to reduce wrongful declined transactions and improving customer experience. Microsoft Dynamics 365 Fraud Protection is a cloud-based solution merchants can use in real-time to help lower their costs related to combatting fraud, help increase their revenue by improving acceptance of legitimate transactions, reduce friction in customer experience, and integrate easily into their existing order management system and payment stack. This solution offers a global level of fraud insights using data sets from participating merchants that are processed with real-time machine learning to detect and mitigate evolving fraud schemes in a timely manner.

Microsoft Dynamics 365 Fraud Protection houses five powerful capabilities designed to capitalize on the power of machine learning to provide merchants with an innovative fraud protection solution:

  • Adaptive AI technology continuously learns and adapts from patterns and trends and will equip fraud managers with the tools and data they need to make informed decisions on how to optimize their fraud controls.
  • A fraud protection network maintains up-to-date connected data that provides a global view of fraud activity and maintains the security of merchants' confidential information and shoppers’ privacy.
  • Transaction acceptance booster shares transactional trust knowledge with issuing banks to help boost authorization rates.
  • Customer escalation support provides detailed risk insights about each transaction to help improve merchants' customer support experience.
  • Account creation protection monitors account creation, helps minimize abuse and automated attacks on customer accounts, and helps to avoid incurring losses due to fraudulent accounts.

See the image below to learn more about the relationship between merchants and banks when they both use Dynamics 365 Fraud Protection:

Dynamics 365 Fraud Protection increases bank acceptance rates and decreases false positives by sharing transaction risk exposure with issuers so they can make more informed assessments.

Banks worldwide can choose to participate in the Dynamics 365 Fraud Protection transaction acceptance booster feature to increase acceptance rates of legitimate authorization requests from online merchants using Dynamics 365 Fraud Protection. Merchants using the product can opt to use this feature to increase acceptance rates for authorization requests made to banks without having to make any changes to their existing authorization process.

Learn more

This week at Sibos 2019 in London, Microsoft will be showcasing its secure and compliant cloud solutions for the banking industry. Read a round-up of announcements unveiled at Sibos and view an agenda of Microsoft events and sessions at the show. Stop by our booth (Z131) for a showcase of applications relevant to banking, including Microsoft Dynamics 365 Fraud Protection, which will be generally available on October 1st, 2019. Contact your Microsoft representative to get started.


Block time off for resources on the schedule board without changing the working hours


I was recently chatting with a dispatcher who expressed that they would like to block time off on the schedule board for resources without needing to interact with the working hours.

I am switching gears on this post and putting on my implementor hat. We will look for ways to enable this scenario natively in the product, but until then, your customers may have this need. Here is a creative way you may be able to accomplish the request. It isn’t perfect, but an idea nonetheless!

The goal is for a dispatcher to stay completely within the schedule board yet block time off for a resource.

At a high level, we will create a new entity and enable it for scheduling. This entity is strictly for tracking “blocked time”. Then, we will create bookings for this entity from the schedule board to block the time. We will ensure these bookings look visually different on the schedule board.

The Setup:

Enabling the "block time" entity for scheduling

 

  • Create a record in the block time table.
Creating a "block time" record called "block time on the board"
  • Create a resource requirement for that block time record. There is no need to enter a duration; you just need the name.
Create a new resource requirement record
  • Create a resource requirement view that shows all requirements if the booking setup metadata = Block Time
    • You only need one resource requirement for the entire organization, but you could opt to create different ones for different territories or dispatchers. That is up to you! In this case, I am assuming there is only one record that the entire organization shares.
Creating a view on the resource requirement entity where booking setup metadata = "block time"
  • Optional: add a new booking requirements tab (requirements panel) to the schedule board dedicated to showing this “block time” requirement.
    • Pro Tip! You can add this to the default schedule board, and then all schedule boards will see this tab unless a board is set to hide default requirement panels.

 

The Action:

Dispatchers can now block off time in two ways:

Option 1: Just drag empty space on the board where you want to block off time. This will pop out the requirements panel. Find the requirement we created earlier and select it. If it is not included in the view, you may have to change views to see that requirement.

Drag empty space on the schedule boardSelect requirement in the flyout panel created earlierBooking showing on the board created for blocking time

Option 2: The second option applies if you implemented the optional step above where we added a booking requirements tab/panel with the “block time” requirement. You can just drag the block time requirement to the resource and time you would like to block off. Since there is no duration on the requirement, it will default to the duration set on the BSM record. You can always drag to extend or reduce the duration.

Here you can see me dragging the “block time” requirement from the “block time” requirement panel to Joseph at 3 PM.

Dragging from bottom panel to board

Here you can see the booking that was created, blocking Joseph's time:

Booking created after dragging to the board

Notice the requirement is not removed from the bottom panel. This is because the filter we created does not exclude bookings which have already been scheduled. This allows you to drag this requirement repeatedly to create “block time” bookings.

You can also accomplish this from the multi-day schedule boards by selecting the day, week, or month you would like to block off. Make sure you select the requirement, and how much time you want to block. If you want to block the entire selected duration, just choose full capacity.

On the daily schedule board, selecting the "block time" requirement, the day I want to block, and changing the booking method to full capacity and booking itSchedule board showing 8 hours booked with the underlying booking on the daily board

Additional Tips:

  • You can use a different booking status for time off. Just create a new booking status and set this status to be the default status on the block time booking setup metadata record. You can change the color of the status, so it looks different on the board. You can also make sure this status cannot be used for bookings related to any other entities. To learn more about booking statuses, check out the documentation and my previous blog post for even more advanced functionality.

Here I am creating a new booking status and setting the color to gray:


Here I am setting the default committed status on the “block time” booking setup metadata record to the status I just created:

Changing the default booking status on the booking setup metadata record to the "block time status"

Here you can see the booking appearing in gray on the board:

"Block time" booking with the color gray
  • You can also change the template for the block time bookings to show whatever info you would like. Just modify the booking template on an individual board, or at a global level for all users. Remember, you can have a different template for each schedulable entity, meaning you can have different information appear on “block time” bookings than work order bookings.
  • If you are using RSO, make sure you set the booking status you plan to use to do not move so resource scheduling optimization knows not to move these bookings.
  • If you use the apply filter territory setting on the board, which filters the requirements panel based on the territory searched in the filter panel, you will need a different requirement per territory to use the tabs on the bottom.
Apply filter territory setting under the settings icon on the schedule board

Cautions:

  1. Because this approach uses bookings, make sure your reports exclude these block time bookings.
  2. The resource template percentage and hours booked calculations will treat these hours as if they are booked. This may be an issue or may not matter depending on your implementation and usage of those numbers.
Schedule board showing the resource cell percentage and booked duration reflecting the "block time" bookings. Also shows the multi-day view showing those hours as booked

Happy scheduling!

Dan Gittler, Principal PM


The DynamicsFinancials connector will be turned off September 30, 2019


Last year, we published a notification regarding the deprecation of the Project Madeira and Dynamics 365 for Finance and Operations, Business edition connectors for Power BI, Microsoft Flow, and PowerApps (the DynamicsFinancials connector). The deprecation was to happen on December 31, 2018. Due to a large number of customers still using the old connector, we continued to make it available in our service to give users more time to make their updates. That time is now up.

On September 30, 2019, the DynamicsFinancials connector will be turned off in our service. Any existing PowerApp, Logic App, or Flow using this old connector will no longer work.

An update to the Dynamics 365 Business Central connector used for Microsoft Flow, Logic Apps, and PowerApps was released the week of September 16, 2019, and we recommend that you use this as a replacement.

Call to action

Update any existing Microsoft Flows, Logic Apps, or PowerApps that were created using the DynamicsFinancials connector.

Instructions on how to update to the latest connector can be found in this blog post.

 


Announcing Customer Service Insights availability in France geographic area


We are very excited to announce that Dynamics 365 Customer Service Insights is now available in France-based datacenters! This is in addition to the nine other geographic areas we've supported since general availability.

When Customer Service Insights creates workspaces, it generates and stores the insights data in the same geographic area where your case data is stored in the Common Data Service (CDS) database. Given that CDS has recently expanded to the France geographic area, this support ensures that all insights data generated from your cases stored in France will stay in France. It can also improve the performance of your workspace refresh.

Below is a full list of data locations that Customer Service Insights supports today. More details can be found in the article Where an organization’s Customer Service Insights data is located. You can also learn more from this article about Microsoft Cloud France.

Azure geographic areas (geos) and their Azure datacenters (regions):

  • Europe: West Europe (Netherlands), North Europe (Ireland)
  • United States: East US (Blue Ridge, VA), South Central US (Des Moines, IA), West US (Quincy, WA)
  • Australia: Australia East (New South Wales), Australia Southeast (Victoria)
  • Asia Pacific: Southeast Asia (Singapore), East Asia (Hong Kong)
  • United Kingdom: UK South (London), UK West (Cardiff, Durham)
  • Brazil: Brazil South (São Paulo State)
  • Canada: Canada Central (Toronto), Canada East (Québec City)
  • India: Central India (Pune), South India (Chennai)
  • Japan: Japan East (Tokyo, Saitama), Japan West (Osaka)
  • France: France Central (Paris), France South (Marseille)

 

As always, your feedback is critical for us to prioritize what's next. If you have any suggestions or ideas, please don't hesitate to submit an idea or vote on others' ideas.

If you have questions about this support or any other features, we're always available at the Customer Service Insights forum to help you.

 

Enjoy!

 

To learn more, visit:


New process to submit support requests for Dynamics 365 Business Central


The Business Central Administration Center has been updated so that partners can submit requests for support. Use the question mark in the top right corner, and then, from the menu, choose the New Support Request menu item.


 

 

This link directs you to the Power Platform Admin Center where you submit the actual support request.

Once you're on the support request page, the customer that you are submitting the request on behalf of automatically shows in the information pane. Next, specify the product as Dynamics 365 Business Central, then the problem type and category.

A new feature with this portal is suggested solutions. Just type something in the Tell us what you need help with area, then choose See solutions. Choose any of the suggested solutions to see the detailed documentation. If these do not resolve the issue, then click Create a support request at the bottom of the pane.

On the next screen you must enter your support Access ID and Contract ID/password. For help with this information, contact your account manager. This is a one-off for your organization, and if you have to submit a support request later, the information is saved so that you can just choose it in the request pane.

Next, enter the issue title and a description, and optionally upload attachments. Choose Next. Finally, enter your contact information and submit the ticket.

Additionally, you can start from the Microsoft Partner Center to direct you to the Power Platform Admin Center. For more information, see View solutions or enter a support request through the new support center in the PowerApps administration content.


Release Notes for Project Service Automation Update Release 19, V2.4.16.40


We're pleased to announce the latest update for the Project Service Automation application for Dynamics 365. This release includes bug fixes.

This release is compatible with Dynamics 365 9.x. To update to this release, visit the Admin Center for Dynamics 365 online, go to the Solutions page, and install the update. For details, refer to How to Install, Update a Preferred Solution.

 

Project Service Automation (V 2.4.16.40)

Bug Fixes

  • Fixed: The out-of-the-box web client SalesOrder form ribbon flashes when msdyn_OrderType is null.
  • Fixed: Adding an ampersand (‘&’) in Time Entry rejection comments throws an error.

 

We're pleased to announce the latest update for the Project Service Automation application for Dynamics 365, V2.4.15.33. This release includes bug fixes.

This release is compatible with Dynamics 365 9.x. To update to this release, visit the Admin Center for Dynamics 365 online, go to the Solutions page, and install the update. For details, refer to How to Install, Update a Preferred Solution.

Bug Fixes

  • Fixed: Entity Views not following ‘Only Related Records’ data source rules.

End of Life Notice

Note: After February 2020, we will no longer support Field Service and Project Service Automation legacy versions.

This includes:

  • Project Service Automation (PSA) version 1.x or 2.x
  • Field Service (FS) version 6.x or 7.x

Please ensure that you have plans to upgrade to the latest version of these solutions. These solutions should be upgraded no later than February 2020.

Administrators can follow instructions to enable their environments to install the latest PSA or FS, or you can contact support to enable the install on your organizations.

We have published a quick start guide with special focus on Field and Project Service to help organizations get started in the upgrade process.

Lastly, please plan to conduct the upgrade and test your use cases in a non-production environment prior to conducting this upgrade on a production environment.

 

 

Project Service Automation Team



IDMF – Configuration Setting – UpdateMasterSyncTablesPostMetaDataSync


Consider a scenario where you are using IDMF 2.0, and you find that after each IDMF Metadata sync the Master table list gets refreshed and you then have to manually sort out the list each time by unchecking/checking tables as Master tables. Well, there’s a simple way to change that default behaviour:

1. Update the AXDataManagementSchedulerService.exe.config file and set UpdateMasterSyncTablesPostMetaDataSync = false.
2. Restart the IDMF service.

Dynamics 365 for Finance and Operations – Things to consider when exporting data to BYOD


I’ve worked on some cases related to Data Management exports to BYOD recently and I thought I would share some observations on things you should consider when exporting data to BYOD. This blog post is not meant to serve as a comprehensive document outlining how the process works, but rather to raise awareness of the sorts of factors that may impact how consistently successful your exports to BYOD will be. Much of this is common sense and non-technical, and it is valid for other data movement scenarios using different technologies.

If you are experiencing data exports failing or only partially succeeding on a recurring basis, then I suggest that you look at the following in your efforts to track down a cause and resolve such issues:

EXAMPLE SCENARIO

You are regularly exporting data from Dynamics 365 for Finance and Operations to a BYOD Azure SQL Database.

Bring your own database (BYOD)
https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/analytics/export-entities-to-your-own-database

1) What kind of error pattern are you seeing?

Do the exports always fail or is the problem intermittent?
Are you seeing this across a bunch of different exports or is a specific export failing very often?
Does there appear to be any date/time pattern to the export failures?
Do the failures seem to occur only for exports where a lot of data is being pushed to the BYOD Azure SQL Database?

2) How are your exports configured?

Are you exporting relatively static data or dynamic (e.g. transactional) data?
How many entities are you trying to export in one go? Might it be possible to split the workload into several more granular exports?
Are you taking full advantage of change tracking and INCREMENTAL PUSH where possible and appropriate?
Are you doing regular FULL PUSH of data that continues to grow in size each time it is exported?

3) Where is your BYOD Azure SQL Database and what service tier is it running on?

If you are pushing data to a different data centre, or your BYOD service tier is “too low”, resulting in throttling of deletes and inserts, then that could lead to timeouts.

Single database: Storage sizes and performance levels
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-dtu-resource-limits-single-databases#single-database-storage-sizes-and-performance-levels

For optimal performance you should ideally be running your BYOD Azure SQL Database on a Premium service tier, i.e. P2 or higher.

Setup for sales tax which should be used when ‘Price include sales tax’ is enabled on a Store


Hello,

If you have any issues regarding sales tax (net amount) rounding between Retail store transactions and posted sales invoice voucher transactions, you should first check the sales tax setup.

The following is the recommended sales tax setup to use when ‘Price include sales tax’ is enabled on a Store.

  1. ‘Marginal base’ should be set to ‘Net amount per line’ for the Sales tax code:

Tab Calculation:

  • Origin= Percentage of net amount
  • Marginal base= Net amount per line
  • Calculation method= Whole amount

2. ‘Rounding by’ is recommended to be set to ‘Sales tax codes’ for the Sales tax group:

It is very important to know that once you make changes to the sales tax setup in HQ, you should run Job 1080 (Tax) and then make sure that the Retail Server is restarted, because the changed tax setup might be cached (restart for immediate effect; otherwise, the Retail Server pool is refreshed in 30 minutes).

With that setup you should always get the same calculation of all amounts (Net amount and Sales tax amount) in POS/Retail and in HQ transactions.

We also suggest changing the sales tax setup only after working hours, when nobody is working, to be sure that the changes are done correctly and distributed to POS, and that there are no differences between HQ and POS or their transactions.

The same is valid for AX2012 and D365.

 

Best regards,

Ramune Peckyte

Microsoft EMEA Customer Services and Support

PL – Retail – Advance invoices for prepayments in retail


Hello,

The hotfix KB 3034468 was released for the feature ‘Advance invoices for prepayments in retail’. The link to LCS:

https://fix.lcs.dynamics.com/issue/results/?q=3034468

This feature is based on the regular Polish functionality of advance invoices in AX and extends the functionality to process advance invoices when a Customer order Deposit is registered in POS or a final Payment for a Customer order is processed. The following is a high-level overview of the process:

There are only two setup steps required to enable and use this functionality in POS (in addition to the regular setup of advance invoices in AX):

  1. Set Retail parameters \ Customer orders \ Create advance invoice for deposit to enable the functionality;
  2. Configure a Windows printer in the Hardware profile of POS to be able to print advance invoices from POS.

The feature does not change user scenarios in POS.

 

Best regards,

Ramune Peckyte

Microsoft EMEA Customer Services and Support

How to reset approved expense report document back to draft status in Dynamics 365 for Finance and Operations



New functionality has been added to provide the ability to reset an approved expense report document back to draft status.

This is available in App 8.0 and later versions. There was a hotfix for App 7.2 and App 7.3. Here is the recommendation for the different versions:

App 7.2 customers should install KB4462583:

https://fix.lcs.dynamics.com/Issue/Details?kb=4462583&bugId=242339&qc=f4bd88056d3bf176b2b3c366d282b73fd6e3e0bb00c7b3e0b86ab96096b89cde

App 7.3 customers should install KB4462581:

https://fix.lcs.dynamics.com/Issue/Details?kb=4462581&bugId=242340&qc=cc0e073c673649b61ece3c6c04146ef87125e8326502f9754751a6dfe16ffc69

Go to Expense management > Periodic tasks > Reset expense document status.

Select the expense report(s) that are stuck and click the “Update to draft” button.

Once the operation is completed, go to the expense report(s) and validate the status to be draft.

 
