Update Guide

Backup Existing Installation - Preparation Steps

Getting upgrade files

Retrieve the zip file from the release page http://releases.local/ and extract it.


*Note: Check whether the zip file is blocked and, if so, unblock it by right-clicking it, going into Properties > General and pressing the Unblock button. If no Unblock button is visible, the zip file is not blocked.*

Database backups

In SSMS, right-click the database (e.g. agr_prod), go into Tasks and select “Back Up…”.

  • [General]
    • Backup type: Full
    • Check “Copy-only backup”
    • Create file destination


  • [Media Options]
    • Overwrite all existing backup sets.


  • [Backup Options]
    • Set backup compression -> Compress backup


  • To make sure you have access to the backup file: Right click on the database, click on Tasks and then on Restore and locate your backup file.
    • If you can locate the file you have access to it.
    • If you cannot locate the file, you do not have access and should get help from IT.
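
If you prefer to check from a query window, you can also verify the backup file with T-SQL – a minimal sketch, substituting the path you chose in the backup dialog:

      -- Checks that the backup set is complete and the file is readable by the SQL Server service
      RESTORE VERIFYONLY FROM DISK = N'E:\AGR backup\agr_prod_backup.bak'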

Remember to take backup for stg and audit as well.

Alternative way to backup the databases:

Click “Script” at the top to generate the script in a new window, close the Backup dialog, and run the backup script with F5.

  • Example script (can be used instead of the dialog steps above):

      BACKUP DATABASE [agr5_stg] TO DISK =
      N'E:\AGR backup\agr_stg_backup_5.2.2._2017-12-07.bak' WITH copy_only, noformat,
      init, NAME = N'agr5_stg-Full Database Backup', skip, norewind, nounload,
      compression, stats = 10
    


Specific changes for the release

Go through the specifics for the release you are upgrading to. If you are jumping over several versions, you need to review the list below for all the versions in between. A good practice is to go through each upgrade script and search for “ALTER” to make sure no customer default values are being overwritten with new settings. If you come across custom changes to procedures/views/functions, take a backup of the corresponding object; it might come in handy later!

Specifics

6.2

Audit DB

  • If the Audit DB is installed, the following change must be made on every tracked database before updating.
    1. Locate the stored procedure [audit].[log_data_change]

    2. There are two versions of this procedure in use at customers. Check which version you have and replace the highlighted code with the code provided below.

      Version 1: (screenshot)

      Version 2: (screenshot)

      Code to replace with:

      --AGR Prod db new core.setting
      IF OBJECT_ID('core.setting') IS NOT NULL
          SET @sql =  'SET @auditdb = (SELECT core.get_setting_value(''auditdb''))'
      --AGR Prod db
      IF OBJECT_ID('dbo.settings') IS NOT NULL
          SET @sql =  'SELECT @auditdb = value FROM dbo.settings WHERE name = ''auditdb'''
      --AGR STG db
      IF OBJECT_ID('ctr.data_transfer_settings') IS NOT NULL
          SET @sql =  'SELECT @auditdb = str_value FROM ctr.data_transfer_settings WHERE setting_id = ''audit_db_name'''
      
      EXEC sys.sp_executesql @sql, N'@auditdb SYSNAME OUTPUT', @auditdb = @auditdb OUTPUT
      
      IF @auditdb IS NULL
          RAISERROR('Audit DB name setting missing from database settings table', 16, 1)
      

Translations

  • If dbo.translations had any custom translation keys before the update, they need to be re-inserted from dbo.BACKUP_6_2_UPGRADE_translations, which is created by the database upgrade script (a sketch follows after this list).
  • See technical manual on cloud translations
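
    A minimal sketch of re-inserting missing custom keys from the backup table; the column names used here (translation_key, translation_value) are assumptions and must be checked against the actual dbo.translations definition before running:

        -- Sketch only: column names are assumed, verify against dbo.translations first
        INSERT INTO dbo.translations (translation_key, translation_value)
        SELECT b.translation_key, b.translation_value
        FROM dbo.BACKUP_6_2_UPGRADE_translations b
        WHERE NOT EXISTS (SELECT 1 FROM dbo.translations t WHERE t.translation_key = b.translation_key)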

Custom procedures

  • In all custom procedures, all logging must be replaced with core.event_insert with the appropriate @event_type_id. See technical manual on event logging
  • In custom workspace actions, getting values from the @query string has become much simpler and some functions like dbo.Json_Parse and dbo.Json_Value2 are now obsolete. All custom action procedures must be adjusted to retrieve values from @query using the standard MSSQL JSON functions instead (see the sketch after this list).
  • Procedure dbo.orders_update_values_custom needs special attention since it can contain custom code that is overridden. The procedure is backed up into dbo.orders_update_values_custom_BACKUP_6_2_UPGRADE, so be sure to add any logic from the backup procedure back into the new procedure. The procedure now supports updates on multiple order lines, so the input parameters have all been changed to strings.
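
    For illustration, a sketch of reading values from @query with the standard MSSQL JSON functions (SQL Server 2016+). The property names used here ($.item_id, $.order_line_ids) are hypothetical; they depend on what the client sends for the action in question:

        DECLARE @query NVARCHAR(MAX) = N'{"item_id": 1234, "order_line_ids": "[1,2,3]"}' -- example payload, shape is hypothetical

        -- Single scalar value (replaces dbo.Json_Value2-style parsing)
        DECLARE @item_id INT = JSON_VALUE(@query, '$.item_id')

        -- Multiple order lines passed as a string parameter can be unpacked with OPENJSON
        SELECT CAST([value] AS INT) AS order_line_id
        FROM OPENJSON(JSON_VALUE(@query, '$.order_line_ids'))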

SQL Agent Zys job

  • Make sure the Zys job zys.every_ten_minutes, which saves database statistics, was created when core.initial_load ran during the upgrade. If it failed, there should be an ERROR event in the event log (core.event_ex) and a warning message should have been printed on your screen. If the Zys job was not created, it is most likely due to missing user rights; ask IT for the rights and then create the SQL job:
            -- CREATE SQL AGENT JOB zys.every_ten_minutes
            BEGIN TRY
                IF EXISTS(SELECT * FROM sys.dm_server_services WHERE servicename LIKE 'SQL Server Agent (%' AND [status] = 4)
                BEGIN
                    DECLARE @database_name NVARCHAR(128) = DB_NAME()
                    DECLARE @job_name NVARCHAR(128) = 'AGR : zys.every_ten_minutes : ' + @database_name
                    PRINT ''
                    IF EXISTS(SELECT 1 FROM msdb.dbo.sysjobs WHERE name = @job_name)
                        PRINT 'Creating job "' + @job_name + '" skipped as it already exists'
                    ELSE
                    BEGIN
                        PRINT 'Creating job "' + @job_name + '"'
    
                        DECLARE @job_owner NVARCHAR(128)
                        IF EXISTS(SELECT 1 FROM sys.server_principals WHERE [type_desc] = 'SQL_LOGIN' AND [name] = 'sa' AND is_disabled = 0)
                            SET @job_owner = 'sa'
    
                        BEGIN TRANSACTION
    
                        DECLARE @job_id UNIQUEIDENTIFIER
    
                        EXEC msdb.dbo.sp_add_job
                            @job_name = @job_name,
                            @enabled = 1,
                            @description = 'Executes proc zys.every_ten_minutes every 10 minutes, action inside proc depends on settings table',
                            @owner_login_name = @job_owner,
                            @notify_level_eventlog = 0, -- Never
                            @job_id = @job_id OUTPUT
    
                        EXEC msdb.dbo.sp_add_jobstep
                            @job_id = @job_id,
                            @step_id = 1,
                            @step_name = @job_name,
                            @command = 'EXEC zys.every_ten_minutes',
                            @database_name = @database_name
    
                        EXEC msdb.dbo.sp_add_jobschedule
                            @job_id = @job_id,
                            @name = @job_name,
                            @enabled = 1,
                            @freq_type = 4,            -- Daily
                            @freq_interval = 1,        -- Unused
                            @freq_subday_type = 4,     -- Minutes
                            @freq_subday_interval = 10 -- Every 10 minutes
    
                        EXEC msdb.dbo.sp_add_jobserver @job_id = @job_id
    
                        COMMIT TRANSACTION
                    END
                END
            END TRY
            BEGIN CATCH 
                IF @@TRANCOUNT > 0
                    ROLLBACK TRANSACTION
                EXEC core.catch_error @object_name = 'CREATE SQL AGENT JOB zys.every_ten_minutes'
            END CATCH

Item extra info -> Product Item SKU Extra Info

Everything that has to do with “Item Extra Info” is replaced with “Product Item SKU Extra Info”, both on prod and staging. The purpose is the same, to get non-standard extra data on item level into the system. We also get the additional “Product Item Extra Info” (Item no level) and “Location And Vendor Extra Info” (Location level). They are meant for any extra data that is stored in the ERP on the product level or location level respectively. Using them, we don’t have to put all extra data down to the lowest level, saving us space and improving performance in workspaces. It is recommended to map extra info data to the appropriate level, for example data on the “Item No” level should go into Product Item Extra Info, rather than Product Item SKU Extra Info.

Since MBE reports only work on the Item ID (Product Item SKU) level, it is necessary to continue mapping the extra info used in reports down to that level. This will not be necessary after reports have been upgraded to use the workspace engine, which is on the development roadmap. Until then, map data to the prep_cus.product_item_extra_info and from there down to prep_cus.product_item_sku_extra_info.

On staging you will see three new objects on the prep_cus schema for each dimension: a table, a view and a procedure. The xxx_extra_info_populate procedure and the v_xxx_extra_info view are both used to populate the xxx_extra_info table, using staging elements. The view should be enough for retrieving most extra_info data, but the procedure is there for columns that need to be calculated or require complicated logic that doesn’t fit well into a view.


What you need to do manually in upgrade

  • Comment out any code in the update script that causes an error due to item_extra_info. Remember what you commented out, as you need to run it after the manual changes.

  • dbo.item_extra_info is backed up in the upgrade. Make sure that all the columns from item_extra_info are included in one of the new extra_info tables. You can do this with minimum effort by copying everything from the cus.item_extra_info_populate procedure into the prep_cus.product_item_sku_extra_info_populate procedure, but consider using this occasion to map the data to the appropriate level rather than pushing all data down to SKU level. You can also clean things up by using views to select all the extra_info columns at once instead of having multiple UPDATE statements, one after another, in an xxx_extra_info_populate procedure (as shown below).

 
        -- Do this......
        ALTER VIEW prep_cus.v_product_item_sku_extra_info AS
            SELECT 
                i.id as product_item_sku_id,
                nav_it.extra_info_column_1,
                nav_it.extra_info_column_2,
                nav_it.extra_info_column_3
            FROM rep.items i
            JOIN nav.Item nav_it on nav_it.[No_] = i.item_no    

        -- Instead of      
    --===================================== *INSIDE procedure prep_cus.product_item_sku_extra_info_populate* =====================================
            UPDATE pisei SET extra_info_column_1 = nav_it.extra_info_column_1
            FROM rep.items i
            JOIN prep_cus.product_item_sku_extra_info pisei ON pisei.product_item_sku_id = i.id
            JOIN nav.Item nav_it ON nav_it.[No_] = i.item_no

    ---------------------------

            UPDATE pisei SET extra_info_column_2 = nav_it.extra_info_column_2
            FROM rep.items i
            JOIN prep_cus.product_item_sku_extra_info pisei ON pisei.product_item_sku_id = i.id
            JOIN nav.Item nav_it ON nav_it.[No_] = i.item_no

    ---------------------------

            UPDATE pisei SET extra_info_column_3 = nav_it.extra_info_column_3
            FROM rep.items i
            JOIN prep_cus.product_item_sku_extra_info pisei ON pisei.product_item_sku_id = i.id
            JOIN nav.Item nav_it ON nav_it.[No_] = i.item_no
  • Replace the MERGE procedures and/or populate procedures for Item Extra Info in the data transfer by executing the appropriate stg_element. This is typically done in procedures cus.data_customization and stg_prod.all_ready_to_prod respectively
        -- ***** PROCEDURE cus.data_customization *****

        --EXEC cus.item_extra_info_populate @l -- commented out in upgrade to v6.2. Using stg_elements instead
        EXEC core.execute_stg_element @stg_element_name = 'prep_cus_product_item_sku_extra_info', @batch_id = @lId --MERGES from prep_cus.v_product_item_sku_extra_info to prep_cus.product_item_sku_extra_info
        EXEC core.execute_stg_element @stg_element_name = 'prep_cus_product_item_sku_extra_info_populate', @batch_id = @lId --Populates the columns in prep_cus.product_item_sku_extra_info that are not included in prep_cus.v_product_item_sku_extra_info
        -- ***** PROCEDURE stg_prod.all_ready_to_prod *****

        --EXEC [stg_prod].[item_extra_info_ready_to_prod] @lId -- commented out in upgrade to v6.2. Using stg_elements instead
        EXEC core.execute_stg_element @stg_element_name = 'prod_product_item_sku_extra_info', @batch_id = @lId --MERGES from prep_cus.product_item_sku_extra_info to core.product_item_sku_extra_info on Prod
        --Make sure the stg_element 'prod_product_item_sku_extra_info' exists in table core.stg_element
  • On prod, the table dbo.item_extra_info is to be replaced with core.product_item_sku_extra_info in all procedures that use it.
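
    One way to find the affected objects is a simple text search over object definitions (it will also match objects already pointing at the new table, so review the hits):

        -- List procedures/views/functions on prod that still mention item_extra_info
        SELECT OBJECT_SCHEMA_NAME(m.[object_id]) AS [schema_name],
               OBJECT_NAME(m.[object_id])        AS [object_name]
        FROM sys.sql_modules m
        WHERE m.[definition] LIKE '%item_extra_info%'
        ORDER BY [schema_name], [object_name]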

  • The references in tables dbo.mbe_tables, dbo.order_view_tables, and dbo.data_element_ref_tables to dbo.item_extra_info should use core.product_item_sku_extra_info instead.

    The primary key in dbo.item_extra_info is named item_id, while in core.product_item_sku_extra_info it is id. Make sure that columns mbe_tables.item_id_field_name, order_view_tables.join_column and data_element_ref_tables.ref_table_id_column all point to the right column name.
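
    It can help to review the current rows before editing; the exact column that stores the referenced table name can vary between installations, so this sketch only lists the three tables so the dbo.item_extra_info references can be located and updated by hand:

        -- Review existing references to dbo.item_extra_info before repointing them
        SELECT * FROM dbo.mbe_tables
        SELECT * FROM dbo.order_view_tables
        SELECT * FROM dbo.data_element_ref_tables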

  • Recreate the order view after changing dbo.order_view_tables to make sure the view references core.product_item_sku_extra_info rather than dbo.item_extra_info:

	EXEC dbo.order_view_recreate
  • To adjust the Advanced Filters in reports, the column filter_txt in dbo.report_filters needs to be updated to use core.product_item_sku_extra_info instead of dbo.item_extra_info. You can do so with a query like this:
	UPDATE report_filters set filter_txt = REPLACE(filter_txt ,'item_extra_info.','core.product_item_sku_extra_info.') FROM report_filters
  • The column sql_query in dbo.mbe_reports needs to be updated, which is done by running the dbo.save_mbe_report_query procedure for all reports. You can use the following cursor to execute (or print EXEC command) for all MBE reports:
    	DECLARE
        @report_id   INT
       ,@user_id INT;
    
        DECLARE report_cursor CURSOR FAST_FORWARD READ_ONLY FOR
        SELECT id AS report_id, [user_id] AS [user_id] FROM dbo.mbe_reports;
    
        OPEN report_cursor;
    
        FETCH NEXT FROM report_cursor
        INTO
            @report_id
            ,@user_id;
    
            
        WHILE @@FETCH_STATUS = 0
        BEGIN
    
            PRINT 'EXEC [dbo].[save_mbe_report_query] @report_id = ' + CAST(@report_id AS NVARCHAR(50)) + ', @user_id = ' + CAST(@user_id AS NVARCHAR(50))
            
            --EXEC [dbo].[save_mbe_report_query] @report_id = @report_id, @user_id = @user_id
    
            FETCH NEXT FROM report_cursor
            INTO
                @report_id
            ,@user_id;
        END;
    
        CLOSE report_cursor;
        DEALLOCATE report_cursor;

Following the update

  • Uninstall the AGR System Monitor application and install it again. See instructions for installation here.
  • Schedule the daily/weekly/monthly job through the Task Scheduler and disable the old jobs scheduled through the SQL Agent. Further instructions on the Task Scheduler can be found here.

6.1

  • Make sure the customer is running SQL2016 or newer. See this stack post for more detail: https://stackoverflow.com/c/agrdynamics/questions/1127
  • The order logic has been refactored quite a bit, so many procedures there have changed. If you have customised the order logic, those changes will likely be overwritten.
  • All updates for dataseries are now going through the [dbo].[de_update_batch] procedure.

6.0

  • In this version all translation files and the settings from the config file on the web server have been moved into the settings table. Therefore, when you upgrade to 6.0.0 you need to set these settings again. This is done either in the dbo.settings table on the prod database or in the UI when you log in as a system user. Most settings are self-explanatory except for the navigation bar. If you have custom workspaces that were added to the navigation bar, you need to add them again; you can find examples in the demo system here: http://192.168.101.126:32793/.
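
    To review what ended up in the settings table after the upgrade, a simple check (dbo.settings stores name/value pairs, as seen in the audit setup earlier in this guide):

        -- List all settings so they can be compared with the old config/translation files
        SELECT [name], [value] FROM dbo.settings ORDER BY [name]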

  • Versions 5.4.1 and 6.0.0 can be run in parallel (on the same database), but it can cause extra complications, so only do this if it is necessary for a particular customer. If the systems are run in parallel, please enable this setting in version 6.0.0.

  • Some custom update procedures in plans/workspaces may require a rewrite since the planner sends the input differently than before. This will mainly affect the merchandising modules, but some customers have custom logic that runs after they enter a number in plans. Please test all custom procedures listed in the de_actions_sps table to make sure this isn’t affecting you (a query to list them follows below).
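
    A quick way to get the list of procedures to test (assuming dbo.de_actions_sps stores the registered custom action procedures, as described above; column names vary, so SELECT * is used):

        -- List the custom procedures registered for plans/workspaces
        SELECT * FROM dbo.de_actions_sps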

5.4.1

  • Upgrade PROD first and then STG, because the STG upgrade depends on a new table in PROD.

  • You need to update the demand formula and remove the reserved data element. Reserved is no longer considered as part of the demand in the order logic but is treated the same
    way as stock or undelivered.

  • Everything being mapped into dbo.demands needs to be moved elsewhere as that table is being removed. A new reserved table has been added but everything else would need a custom table.

  • The element_id column has been removed from the forecast, safety stock and histories tables – if there are any customizations using these columns, they need to be adjusted.

  • The database has been cleaned up quite a bit – please remember to back everything up in case you are relying on any of these
    o All tables/procedures from the first version of the planner have been removed
    o All the old mbe filters tables have been removed
    o Many other outdated tables/procedures/views have been removed

5.4.0

The feature demand from stores (dfs), formerly known as known demand, is officially added. These actions are not dfs-specific but apply to everyone.

  • Always start with a backup of the prod and stage databases, especially for this version.

  • Then, after the backup, the first thing to do when upgrading is to check the audit database: does it contain the prod tables, like settings and data_elements, and does it track structure changes in both the stage and prod databases? Upgrade the audit database before you start with 5.4.0.

  • There are changes to v_bom_demand. If you do not use BOM you can simply upgrade the view; if you use BOM you need to merge your version with the new one – the new data series is the BOM data.

  • A column called demand_setup is being added to item_details. It is set to 1 if you want demand from stores to be used instead of forecast in the demand series in data_elements, and the column needs to be populated accordingly.

    • In the view rep.v_items_details these three columns should be added (assuming the locations table is in the join):

          CAST(NULL AS DATE) AS [forecast_to_date],
          CAST(NULL AS DATE) AS [forecast_from_date],
          -- with demand from stores:
          CASE WHEN loc.location_type = 'warehouse' THEN 1 ELSE 0 END AS demand_setup
          -- without demand from stores:
          -- 0 AS demand_setup

    • Run the update on stage: exec [rep].[item_details_stg_to_ready]
    • Run the update to prod: exec [stg_prod].[item_details_ready_to_prod]
  • You need to update the demand series formula manually since it’s likely this has been customised in some way:

        UPDATE data_elements
        SET value_field_calculated = 'COALESCE([planner_baseline], CASE WHEN item_details.demand_setup = 1 THEN ISNULL([demand_from_stores],0) ELSE ISNULL([forecast],0) END) + ISNULL([reserved],0) + ISNULL([bom_demand],0) + ISNULL([planner_promotions],0)'
        WHERE name = 'demand' -- id = 17

  • The table order_line_data_snapshots is being phased out. You need to check dependencies (see the sketch after this list) and then rename the table to order_line_data_snapshots_Archive, so that if there is still something depending on it, it HAS to break. The new table is order_line_calc_results and should be populated in the upgrade with the old data (no history should be lost).

    • If there is anything depending on the table it needs to be redirected to [dbo].[order_line_calc_results] which is replacing the old snapshots table
    • Column names that have changed: forc_for_order_period, demand_total, undelivered, undelivered_arrived
    • Then rename the table to archive it for storage: sp_rename 'order_line_data_snapshots', 'order_line_data_snapshots_Archive';
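
    A hedged sketch of the dependency check before the rename (sys.sql_expression_dependencies catches references from views, procedures and functions, but references from dynamic SQL will not show up here):

        -- Find objects that still reference the old snapshots table
        SELECT OBJECT_SCHEMA_NAME(d.referencing_id) AS [schema_name],
               OBJECT_NAME(d.referencing_id)        AS referencing_object
        FROM sys.sql_expression_dependencies d
        WHERE d.referenced_entity_name = 'order_line_data_snapshots'
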
  • If you want dfs to be enabled there is a setting on prod (demand_from_stores). You set it to true and set the days if you want (default is 365).

  • If you had known demand set up in any older version, it should be removed.

  • The function dbo.fn_mbe_sql_flat is replaced by the procedure dbo.mbe_qb_sql_flat, so if there is any custom code that uses dbo.fn_mbe_sql_flat it needs to be updated.

  • There are changes to job.all_daily_run, so when upgrading you have to merge yours with the new one. BOM is removed and the order of things is changed.

5.3.2

Web site: Please make sure to delete/rename the web folder so that the installer will include new config files. If you run the installer without removing the web folder, no config files will be replaced.

Database: If the customer has custom workspaces/plans that do not have workspace type “Sales Plan”, the following tables need to be populated:

  • features
  • de_workspace_features

Also, the config file needs to be configured so that a link to this workspace will appear somewhere in the navigation, since workspaces that are not of the type “Sales Plan” will no longer appear under Plans.

5.3.1

Web site: Please make sure to delete/rename the web folder so that the installer will include new config files. If you run the installer without removing the web folder, no config files will be replaced, and in this version there are many changes to these files – the system won’t work if you don’t have the updated web.config file. API logging will be automatically turned on after the upgrade; more info here.

User Management

  • The settings of which features are enabled have been moved to the features table in the database (they are no longer in the config file as they used to be).
  • By default, after the upgrade the planner and counting module are disabled, so for clients that are using these features you need to enable them in the features table.
  • We have a new system user (system, password deleteme) that cannot be deleted and is intended for AGR Consultants only. This user will always have access to user management and the development mode that enables us to configure advanced settings within the planner.
  • All users that have already been created in the system will be assigned the purchaser role, but you will need to manually add the planner role to users that should have access to the planner.

Databases: Only run the stg upgrade script if you want to show the stock history on the dashboard. You will also need to modify the data_element table to point the dashboard stock data element to a new pre-calculated stock history table called histories_stock_dashboard. Further documentation can be found here.

5.3.0

Dynamic compression This can be set up on the server to boost performance in the client. The downside is that it will cost a little more CPU on the server side, but that’s just an option that must be weighed each time depending on the resources available. To set up dynamic compression follow these steps:

If the role is not installed, open the Server Manager. Go to the Dashboard, then click “Add roles and features”. Click Next through each page of the wizard until you get to the Server Roles page. Locate the Dynamic Content Compression role under Web Server (IIS), Web Server, Performance and tick the checkbox. Continue to the Confirmation page and click Install.

Once the role is installed go back to the Compression module page in IIS Manager and ensure that both dynamic compression and static compression are enabled. Note that you need to do this at the Server level not the Website level.

Compression is now set up, but we still need to add the mimeTypes that we want to compress. Go to the Configuration Editor.

Select system.webServer/httpCompression from the Section dropdown list and click on the edit button next to dynamicTypes:

Add entries for the mimeTypes “application/json;charset=utf-8” and “application/json” with Enabled set to True.

Remember to press Apply on the changes after closing the mimeTypes window:

Now you must restart IIS for the changes to be applied. You can go to the command line and run “iisreset”. To confirm that compression is now on, open AGR5 in your web browser and check the response headers under the Network tab in the developer tools; Content-Encoding should be gzip:

5.2.3

  • If you have started using planner, all plans/views will be lost. If the planner has been customized in any way please read this memo

  • New Item Group Tables

  • With the data series planner that was released in version 5.2.0 there came two new tables item_group_columns/item_group_details that store the same information as the old item_group table. From version 5.2.3 the standard AGR5 solution only uses the new tables.

  • If you are installing 5.2.3 from scratch, please only map the new tables (item_group_columns/details) and make sure the item table refers to the id in the item_group_columns table. We have not created a standard version of the rep views for these tables on staging, so after you map them please commit them to Kallinn or send them to me so that they can be a part of the next version.

  • If you are upgrading to 5.2.3 from older versions you only need to change the data mapping if the customer has the Sales Planner.

5.2.2

  • There is going to be an automatic backup of the tables data_elements, data_element_ref_columns, data_element_ref_tables, chart_elements and dashboard_elements. Please check whether there is any custom setup in these tables and re-enter it after the update.
  • Run the following to start automatically tracking changes to setup/config tables in the agr5 prod database:

        EXEC initial_load_table_content_tracking @auditdb_name = 'NAMEOFAUDITDB'

5.2.1

Web service

  • Edit the web.config file. Note: it is not possible to reuse the old one when updating to 5.2.1 (there are now three places where you need to update the DB and server info).

Enable SQL Service Broker

Run the following commands on prod:

        ALTER DATABASE agr5_prod SET SINGLE_USER WITH ROLLBACK IMMEDIATE
        GO
        ALTER DATABASE agr5_prod SET ENABLE_BROKER
        GO
        ALTER DATABASE agr5_prod SET MULTI_USER
        GO
        SELECT is_broker_enabled FROM sys.databases WHERE name = 'agr5_prod';

Install AGR Forecast Listening Service
Run the AGR5 Installer (AGR5Installer.exe) on the Server where the web service is located

  • Enter the name of the Database Server and the AGR5 production database:
  • SQL Server Name: The instance of the SQL server where the AGR5 db is to be installed. It can either be an ip address or the machine name. If using a named instance, the syntax is the following: MACHINE_NAME_OR_IP\INSTANCE_NAME.
  • Production Database Name: The name of the AGR5 production database, recommended name is agr5_prod.
  • Web site base url: It is important that this contains the full URL including http or https and the port number, this is used by the forecast service to trigger forecasts in the web client.
  • Port number: To be able to create the web site in the installer a port number is needed, if using the default port please enter 80.
  • User name: The user that has been configured in section 3. The domain needs to be included.
  • Password: The password for the user.

Once all values have been entered please click Next:

Go to Services and check whether the ForecastListeningService is started; if not, click Start.

Test whether you can run new forecasts through the item card and by creating orders. If you have any problems, look at the troubleshooting section in the technical manual.

  • There is a new config file (\web\assets\config\config.js) to enable features and set the locale (it’s not done in the client anymore). By default, the planner and dashboard are not enabled and the locale is set to en_gb. Please review these settings.
  • This file replaces the old features.js file which is no longer in use
  • There is going to be an automatic backup of two tables, data_elements and chart_elements. Please check whether there is any custom setup in these tables and re-enter it after the update.

5.2.0

Web service

  • Edit the web.config file. Note it is not possible to reuse the old one when updating to 5.2.0.

Database

  • There is going to be an automatic backup of two tables, data_elements and chart_elements. Please check whether there is any custom setup in these tables and re-enter it after the update.
  • Edit the api_route setting in the settings table with the web service path of AGR5. If the web service and SQL Server are not on the same box, it has to be the external path to the web service (not localhost; use e.g. SERVER_NAME:8080 or IP_NUMBER:8080).
  • The forecast is no longer run automatically every month. Please create a monthly job that executes EXEC [job].[run_forecasts_load_locations] (a sketch of one way to do this follows after this list).
  • Go through every report manually and check whether it uses the Days In Stock field (in columns or filters) and replace it with the new Days cover (committed) column. Don’t forget to look at the reports users have set to be visible only to them. After you have done this, delete the old column from the mbe_columns table.
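
    A minimal sketch of one way to schedule the monthly forecast run with SQL Server Agent; the job and schedule names are illustrative and the database name is an assumption, so adjust both to local conventions:

        -- Sketch: create a SQL Agent job that runs the forecast procedure on day 1 of each month
        DECLARE @job_id UNIQUEIDENTIFIER

        EXEC msdb.dbo.sp_add_job
            @job_name = N'AGR : monthly forecast run',      -- illustrative name
            @enabled  = 1,
            @job_id   = @job_id OUTPUT

        EXEC msdb.dbo.sp_add_jobstep
            @job_id        = @job_id,
            @step_name     = N'run_forecasts_load_locations',
            @command       = N'EXEC [job].[run_forecasts_load_locations]',
            @database_name = N'agr5_prod'                    -- assumption: prod database name

        EXEC msdb.dbo.sp_add_jobschedule
            @job_id        = @job_id,
            @name          = N'Monthly',
            @freq_type     = 16,   -- Monthly
            @freq_interval = 1     -- On day 1 of the month

        EXEC msdb.dbo.sp_add_jobserver @job_id = @job_id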

Download release files

Go to http://releases.local/, download the release you want to install, and unzip the file.


Install a DB patch or update script

Run AGR patch scripts

When patching navigate to db_scripts -> patches and run the patch scripts, both for the prod and stg database.

Run AGR upgrade scripts

When updating navigate to db_scripts -> agr_prod/agr_stg

  • Apply the database changes by running each update file (stg,prod,audit - if that applies) from the current version to the version you are upgrading to. Example (5.4.1 to 6.2-db0)
    • agr_prod_update_from_5.4.1-db6_to_6.0-db3.sql and agr_stg_update_from_5.4.1-db6_to_6.0-db3.sql
    • agr_prod_update_from_6.0-db3_to_6.1-db6.sql and agr_stg_update_from_6.0-db3_to_6.1-db6.sql
    • agr_prod_update_from_6.1-db6_to_6.2-db0.sql and agr_stg_update_from_6.1-db6_to_6.2-db0.sql
  • Remember to look for “ALTER” in the upgrade scripts to make sure no customer default values are being overwritten with new settings.


Install a Client patch or update

Backup Web folder

  • It is a good habit to back up the AGR Web folder before updating or patching
    • You can locate the web folder by launching the IIS Manager and clicking Sites -> AGR_Web -> Explore

Run the AGR installer

  • Execute the AGRInstaller_6.x-rx.exe installer. Check the components that have changed between the releases. You can see version numbers for an individual component on the release page. If the “Client version” has changed select the “Web Client” component and if the “Server/API version” has changed select both “Web API” and “AppService” components.
  • When updating to a new major release (e.g. 6.1 to 6.2) it is necessary to select the “DB Migration” component.


Once all necessary components have been selected, click Next

Select the location of the AGR folder.


Fill out all input fields


All the settings are required for the installation to be successful:

  • SQL Server Name: The instance of the SQL server where the AGR database is to be installed. It can either be an IP address or the machine name. If using a named instance the syntax is the following : MACHINE_NAME_OR_IP\INSTANCE_NAME.
  • Production Database Name: The name of the AGR production database, recommended name is agr_prod. This must match the prod name that’s already been created in the previous step.
  • AppService - User Name: The user running the AppService
    • The user must have the “Log on as a batch job” and “Log on as a Service” rights.
  • AppService - Password: The password for the AppService user
    • Make sure that the password does not expire
  • AppService - Port: The port number for the AppService. You need to change this number if you want to have more than one instance of the AppService on the server.

Click Next

If “Skip user validation” is left unchecked the installer will try to validate the IIS and AppService user.


Click “Install”

If the “DB Migration” component was selected, a DB Migration dialog will appear. The DB Migration will connect to the database and migrate all the workspace configs to a newer version. Let it run through and, once completed, close the window.


If you see “Completed” in the installer window, the installation is complete.


Linked server problems after update

If you encounter problems with linked server connections after an update, please check the linked server section in the Installation Guide