Thursday, April 18, 2019

Source: https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/analytics/export-entities-to-your-own-database

This topic explains how administrators can export data entities from Microsoft Dynamics 365 for Finance and Operations into their own Microsoft Azure SQL database. This feature is also known as bring your own database (BYOD). The BYOD feature was released in Microsoft Dynamics AX with platform update 2 (August 2016). Minor improvements and bug fixes have been included in subsequent platform updates.
The BYOD feature lets administrators configure their own database, and then export one or more data entities that are available in Finance and Operations into it. (Currently, more than 1,700 data entities are available.) Specifically, this feature lets you complete these tasks:
  • Define one or more SQL databases that you can export entity data from Finance and Operations into.
  • Export either all the records (full push) or only the records that have changed or been deleted (incremental push).
  • Use the rich scheduling capabilities of the Finance and Operations batch framework to enable periodic exports.
  • Access the entity database by using Transact-SQL (T-SQL), and even extend the database by adding more tables.
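For example, once entities have been published and exported, the destination database can be queried with ordinary T-SQL from any tool or language. The following is a minimal sketch using Python and the pyodbc package; the table name dbo.CustomersV3 is a placeholder for whatever entity table exists in your database, and the connection details are placeholders as described later in this topic.

import pyodbc

# Minimal sketch: read entity data that BYOD exported into your own Azure SQL
# database by using plain T-SQL. Replace the placeholders with your own values.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=nnnn.database.windows.net,1433;"
    "Database=<your DB name>;"
    "Uid=<SQL user ID>;Pwd=<password>;"
    "Encrypt=yes;"
)

cursor = conn.cursor()
# dbo.CustomersV3 is a hypothetical table created by publishing an entity.
cursor.execute("SELECT TOP 10 * FROM dbo.CustomersV3")
for row in cursor.fetchall():
    print(row)

conn.close()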

Entity store or BYOD?

If you followed the series of blog posts about Microsoft Power BI integration, you will be familiar with Entity store. Entity store is the operational data warehouse that is included with Finance and Operations. Entity store provides built-in integration of operational reports with Power BI. Ready-made reports and analytical workspaces that are built into Finance and Operations use Entity store. If you write Power BI reports by using data in your Finance and Operations environment, you should use Entity store.
However, the BYOD feature is recommended for the following scenarios:
  • You must export data from Finance and Operations into your own data warehouse.
  • You use analytical tools other than Power BI, and those tools require T-SQL access to data.
  • You must perform batch integration with other systems.
 Note
Finance and Operations doesn't allow T-SQL connections to the production database. If you're upgrading from a previous version of Finance and Operations, and you have integration solutions that require direct T-SQL access to the database, BYOD is the recommended upgrade path.
As a customer of Finance and Operations, you can use either Entity store or BYOD. The default operational reports that are available take advantage of embedded Power BI and Entity store. We recommend that you use our default operational reports as your first choice. You can also extend the ready-made operational reports to meet your requirements. You should consider BYOD a complementary option that you use as you require.

Creating a SQL database

Before you can configure the entity export option and use the BYOD feature, you must create a SQL database by using the Azure portal.
For one-box development environments, you can create a database in the local Microsoft SQL Server database. However, this database should be used only for development and testing purposes. For production environments, you must create an Azure SQL database.
You should also create a SQL user account for sign-in to the database. Write down the server name, database name, and the SQL user ID and password. You will use this information when you configure the entity export option in the next section.
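The exact steps for creating that user account depend on your tooling, but as a rough sketch (assuming you connect with a server admin account and substitute your own names and passwords), it could be done as follows. CREATE LOGIN runs against master and CREATE USER against the BYOD database; the db_owner role is shown only as an example, so grant whatever your security policy allows.

import pyodbc

# Rough sketch: create a SQL login and a database user for BYOD sign-in.
# All names, passwords, and permissions below are placeholders.
admin = ("Driver={ODBC Driver 17 for SQL Server};"
         "Server=nnnn.database.windows.net,1433;"
         "Uid=<admin user>;Pwd=<admin password>;")

with pyodbc.connect(admin + "Database=master;", autocommit=True) as conn:
    conn.execute("CREATE LOGIN byodexport WITH PASSWORD = '<strong password>'")

with pyodbc.connect(admin + "Database=<your DB name>;", autocommit=True) as conn:
    conn.execute("CREATE USER byodexport FOR LOGIN byodexport")
    # Example only: grant whatever role your security policy allows.
    conn.execute("ALTER ROLE db_owner ADD MEMBER byodexport")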
If you're using the BYOD feature for integration with a business intelligence (BI) tool, you should consider creating a SQL premium database. Premium databases support clustered columnstore indexes (CCIs). CCIs are in-memory indexes that improve the performance of read queries that are typical in analytical and reporting workloads. If you're using the BYOD feature to export data into a staging database or for general integration purposes, you can use a standard database.

Configuring the entity export option

  1. Start the Finance and Operations client, and then, in the Data management workspace, select the Configure Entity export to database tile.
  2. If you've configured any databases, a list is shown. Otherwise, you must configure a new database. In this case, select New, and then enter a unique name and a description for the new database. Note that you can export entities into multiple databases.
  3. Enter the connection string in the following format:
    Data Source=<logical server name>,1433; Initial Catalog=<your DB name>; Integrated Security=False; User ID=<SQL user ID>; Password=<password>
    In this connection string, the logical server name should resemble nnnn.database.windows.net. You should be able to find the logical server name in the Azure portal. The following illustration shows an example of a connection string.
    Connection string on the New record page
  4. Select Validate, and make sure that the connection is successful.
    • The Create clustered column store indexes option optimizes the destination database for selected queries by defining CCIs for entities that are copied from Finance and Operations. However, CCIs are currently supported only on SQL premium databases. Therefore, to enable this option, you must create a SQL premium database.
    • The Enable triggers in target database option sets export jobs to enable SQL triggers in the target database. This option lets you hook downstream processes into the trigger to orchestrate actions that must be started after records have been inserted. One trigger is supported per bulk insert operation. The size of the bulk insert is determined by the Maximum insert commit size parameter in the Data management framework.
For scenarios in which reporting systems read data from BYOD, there is always the challenge of ensuring that the reporting systems get consistent data from BYOD while the sync from Finance and Operations is in progress. You can achieve this result by not having the reporting systems read directly from the staging tables created by the BYOD process. The staging tables hold the data while data is being synced from the Finance and Operations instance and hence will be constantly changing. Use the SQL trigger feature to determine when the data sync from Finance and Operations has been completed, and then hydrate the downstream reporting systems.
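As an illustration only, a trigger of this kind might look like the following sketch. The table names (dbo.CustomersV3 and dbo.BYODSyncLog) and the idea of logging each completed bulk insert to a sync-log table are assumptions; the actual orchestration you hook in depends on your downstream systems.

import pyodbc

# Illustrative sketch: a trigger in the target database that records when BYOD
# completes a bulk insert, so downstream processes know when to hydrate.
conn = pyodbc.connect("<your BYOD database connection string>", autocommit=True)

conn.execute("""
    IF OBJECT_ID('dbo.BYODSyncLog') IS NULL
        CREATE TABLE dbo.BYODSyncLog (
            TableName  NVARCHAR(128),
            RowsLoaded INT,
            LoadedAt   DATETIME2 DEFAULT SYSUTCDATETIME()
        );
""")

conn.execute("""
    CREATE OR ALTER TRIGGER dbo.trg_CustomersV3_Synced
    ON dbo.CustomersV3
    AFTER INSERT
    AS
    BEGIN
        -- Fires once per bulk insert (see Maximum insert commit size).
        INSERT INTO dbo.BYODSyncLog (TableName, RowsLoaded)
        SELECT 'CustomersV3', COUNT(*) FROM inserted;
    END
""")

conn.close()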
When the validation is passed, the database that you configured for entity export appears in the list of databases, as shown in the following illustration.
Database for entity export
You can now publish one or more entities to the new database by selecting the Publish option on the menu.

Publishing the entity schema to the database

The Publish page enables several scenarios:
  • Publish new entities to the database.
  • Delete previously published entities from the database. (For example, you might want to re-create the schema.)
  • Compare published entities with the entity schema in Finance and Operations. (For example, if new fields are added to Finance and Operations later, you can compare the fields with your database schema.)
  • Configure change tracking functionality that enables incremental updates of your data.
The following sections discuss each option.

Publish

The Publish option defines the entity database schema on the destination database. When you select one or more entities, and then select the Publish option, a batch job is started. This job creates the entities in the destination database. When the database definition job is completed, you receive a message, which you can access by using the bell symbol in the upper right.
The actual data update occurs when you export data. At this point, you're just creating the schema.

Drop entity

The Drop entity option deletes the data and the entity definition from the destination database.

Compare source names

The Compare source names option lets you compare the entity schema in the destination with the entity schema in Finance and Operations. This option is used for version management. You can also use this option to remove any unwanted columns from the destination table.

Configure change tracking

Change tracking is a feature that is provided in SQL Server and SQL Database. Change tracking enables the database to track changes including deletes that are made on tables. The system uses change tracking to identify changes that are made to tables as transactions in Finance and Operations. However, because Finance and Operations must track changes at the data entity level, there is additional logic on top of SQL change tracking to make this functionality work. The steps to enable change tracking are explained later in this section.
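You don't run change tracking statements yourself against the Finance and Operations database (direct T-SQL access isn't allowed, and the application manages change tracking when you enable it on an entity). Purely to illustrate the underlying SQL mechanism, change tracking at the database level looks roughly like the sketch below; dbo.MyTable and its key column are placeholders, and the statements should only ever be tried on a test database of your own.

import pyodbc

# Illustration only of the SQL change tracking mechanism. Do not run this
# against a Finance and Operations database; use a test database of your own.
conn = pyodbc.connect("<connection string to a test database>", autocommit=True)

conn.execute("ALTER DATABASE CURRENT SET CHANGE_TRACKING = ON "
             "(CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON)")
conn.execute("ALTER TABLE dbo.MyTable ENABLE CHANGE_TRACKING")

# Later, rows changed since a given version (here 0) can be read like this;
# MyTableId stands in for the table's primary key column.
rows = conn.execute(
    "SELECT CT.SYS_CHANGE_OPERATION, CT.MyTableId "
    "FROM CHANGETABLE(CHANGES dbo.MyTable, 0) AS CT").fetchall()
print(rows)

conn.close()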
The Change tracking option on the Publish page lets you configure how changes are tracked on the underlying entity.
Change tracking on the underlying entity
The following change tracking options are available.
  • Enable primary table: An entity consists of several tables. Select this option to track all changes that are made to the primary table of the entity. When changes are made to the primary table, the corresponding record is inserted into or updated in the destination database. Although data from the whole entity is written to the destination table, the system triggers the insert or update only when the primary table is modified.
  • Enable entire entity: Select this option to track all changes to the entity. (These changes include changes to all the tables that make up the entity.) When changes are made to the entity, corresponding updates are made to the destination.
  • Enable custom query: This option lets a developer provide a custom query that the system runs to evaluate changes. This option is useful when you have a complex requirement to track changes from only a selected set of fields. You can also select this option when the entities that will be exported were built by using a hierarchy of nested views. For more information, see Enable change tracking for an entity by using a custom query.
To use change tracking, you must enable it in data management. Go to Data management > Data entities to open the Data entities list page, select an entity, and then choose one of the options described above to enable change tracking on that data entity.
If you republish an entity that exists in the destination database, the system warns you that existing data will be deleted because of the new operation.
When you confirm the publish operation, the system publishes the schema to the database, and you're notified when the operation is completed.
By selecting the Show published only option on the Publish page, you can show only the entities that were published to a given destination database. The Publish function creates the entity schema in the database. You can navigate to the database and see the table schemas that were created, together with corresponding indexes.
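For example, a quick way to confirm what the Publish function created, without opening a management tool, is to query the database metadata; the connection string below is a placeholder.

import pyodbc

# Sketch: list the tables and indexes that publishing created in the BYOD database.
conn = pyodbc.connect("<your BYOD database connection string>")

for schema, table in conn.execute(
        "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES "
        "WHERE TABLE_TYPE = 'BASE TABLE' ORDER BY TABLE_NAME"):
    print(schema, table)

# Corresponding indexes, including any clustered columnstore indexes (CCIs).
for table, index, index_type in conn.execute(
        "SELECT t.name, i.name, i.type_desc "
        "FROM sys.indexes i JOIN sys.tables t ON i.object_id = t.object_id "
        "WHERE i.name IS NOT NULL"):
    print(table, index, index_type)

conn.close()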
 Note
Currently, you can't use BYOD to export composite entities into a database. You must export each entity in the composite entity.

Exporting data into your database

After entities are published to the destination database, you can use the Export function in the Data management workspace to move data. The Export function lets you define a Data movement job that contains one or more entities.
You can use the Export page to export data from Finance and Operations into many target data formats, such as a comma-separated values (CSV) file. This page also supports SQL databases as another destination.
Export page
You can create a data project that has multiple entities and schedule it to run by using the Finance and Operations batch framework. You can also schedule the data export job to run on a periodic basis by selecting the Export in batch option.
The same job can also be used to export data from all companies. You can turn on this capability by enabling the DMFEnableAllCompanyExport flight, as explained in Features flighted in data management and enabling flighted features.
 Note
Use of recurring exports in Manage > Manage recurring data jobs for BYOD is discouraged. You must use the Export in batch option instead.

Incremental export

When you add an entity for data export, you can select to do an incremental export (which is also known as incremental push) or a full push. For incremental push to work, you must enable the Change tracking option in the Finance and Operations database and specify an appropriate change tracking option, as described earlier in this topic.
 Note
A full push deletes all existing records from an entity and then inserts the current set of records from the selected entity.
If you select an incremental push, the first push is always a full push. This is because SQL needs to know which records are being tracked before it can track subsequent changes. After that, whenever a record is inserted, updated, or deleted, the corresponding change is reflected in the destination entity.
Because the first push is always a full push, we do not recommend that you do an explicit full push before you enable change tracking.
We recommend that you first enable change tracking and then schedule an export job with incremental push. This takes care of the first full push and the subsequent incremental exports.

Timeouts

The default timeouts for BYOD exports are set to ten minutes for truncation operations and one hour for actual bulk insert operations. When volumes are high, these timeout settings may not be sufficient and must be updated. Starting with the release of Platform update 18, you can update the timeout settings by navigating to Data management > Framework parameters > Bring your own database.

Known limitations

The BYOD feature has the following limitations.

There should be no active locks on your database during synchronization

Because BYOD is your own database, you must ensure that there are no active locks on your Azure SQL database when data is being synced from Finance and Operations. Having active locks on your database during synchronization can result in slow writes or even failure to export to your Azure SQL database.
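A rough way to check, assuming your SQL user has the VIEW DATABASE STATE permission, is to look at the dynamic management views on your database before or during an export window; the connection string below is a placeholder.

import pyodbc

# Sketch: check for blocking sessions on the BYOD database before an export runs.
conn = pyodbc.connect("<your BYOD database connection string>")

blocking = conn.execute(
    "SELECT r.session_id, r.blocking_session_id, r.wait_type, r.command "
    "FROM sys.dm_exec_requests r "
    "WHERE r.blocking_session_id <> 0").fetchall()

if blocking:
    print("Blocking detected; consider pausing these workloads during the sync:")
    for row in blocking:
        print(row)
else:
    print("No blocking sessions found.")

conn.close()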

You can't export composite entities into your own database

Currently, composite entities aren't supported. You must export the individual entities that make up the composite entity. However, you can export all of them in the same data project.

Entities that don't have unique keys can't be exported by using incremental push

You might face this limitation especially when you try to incrementally export records from a few ready-made entities. Because these entities were designed to enable the import of data into Finance and Operations, they don't have a unique key. However, you can enable change tracking only for entities that have a unique key. Therefore, there is a limitation on incremental push. One workaround is to extend the required entity and define a unique key.

Tuesday, April 16, 2019

Integrating Dynamics 365 ERP and CRM – Using Microsoft Flow and Common Data Service

Why can’t we just sync data between CRM and ERP using Microsoft Flow only? Why do we need to use Common Data Service?
Dynamics 365 unifies CRM and ERP capabilities into applications that work seamlessly together across different areas of a business. Two main components of Dynamics 365 are CRM and ERP applications.
For customers who use both CRM and ERP applications (for example, Dynamics 365 for Sales and Dynamics 365 for Operations), it is important that these applications integrate and work together. One option for customers and partners is to use Microsoft Flow and Common Data Service (CDS).
In most cases, data cannot be mapped directly between these products; the same data may be stored in different ways or formats in each application.
A simple example is how currency is stored. In Dynamics 365 for Sales, a customer’s currency is stored by currency name, such as US Dollar, whereas in Dynamics 365 for Operations the same data is stored by currency code, such as USD. In such cases, we can use a staging database in CDS to map the data as required by the destination application.
Very well explained; the full article is here:

https://suvidhablogpost.wordpress.com/2017/03/29/integrating-dynamics-365-erp-and-crm-using-microsoft-flow-and-common-data-service/
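As a toy sketch of the kind of mapping described above (the field names and the dictionary contents are made up for illustration):

# Toy illustration of the currency-mapping idea: translate the currency name
# used in Dynamics 365 for Sales to the ISO code used in Dynamics 365 for
# Operations while the record passes through the CDS staging entity.
CURRENCY_NAME_TO_CODE = {
    "US Dollar": "USD",
    "Euro": "EUR",
    "British Pound": "GBP",
}

def map_sales_record_to_operations(sales_record):
    """Return a copy of the record with the currency translated."""
    record = dict(sales_record)
    record["CurrencyCode"] = CURRENCY_NAME_TO_CODE[record.pop("CurrencyName")]
    return record

print(map_sales_record_to_operations({"Account": "Contoso", "CurrencyName": "US Dollar"}))
# {'Account': 'Contoso', 'CurrencyCode': 'USD'}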

Sending email by using Python

# This is the code for sending email using Gmail and Python

# Import smtplib for the actual sending function
import smtplib

# Import the email modules we'll need
from email.mime.text import MIMEText

# Open a plain text file for reading.  For this example, assume that
# the text file contains only ASCII characters.
textfile = 'content.txt'

with open(textfile) as fp:
    # Create a text/plain message from the file contents
    msg = MIMEText(fp.read())
print(msg)

me = 'youraccount@gmail.com'
to = 'recipient@gmail.com'

msg['Subject'] = 'The contents of %s' % textfile
msg['From'] = me
msg['To'] = to

# Send the message via Gmail's SMTP server over SSL.

# You need to turn on 'Allow less secure apps' to get access to the SMTP server
# https://myaccount.google.com/lesssecureapps?pli=1

server = smtplib.SMTP_SSL('smtp.gmail.com', 465)
server.ehlo()

gmail_user = 'youraccount@gmail.com'
gmail_password = 'password'

server.login(gmail_user, gmail_password)
server.sendmail(me, to, msg.as_string())
server.quit()

Allow less secure apps
Turn on 'Allow less secure apps' to get access to the SMTP server




