How to: Get field values from dynamically loaded tables

If you need field values from different tables at runtime within the same function, here is a nice solution:

 

// variables
tableType, Option : CompInfo,PayTerm
result : ARRAY [5] OF Variant
compInfo : Record Company Information
payTerm : Record Payment Terms
recID : RecordID

OnRun()
LoadDynamicallyFieldValues(tableType::CompInfo, result);
MESSAGE(FORMAT(result[1])+', '+FORMAT(result[2])+', '+FORMAT(result[3]));
LoadDynamicallyFieldValues(tableType::PayTerm, result);
MESSAGE(FORMAT(result[1])+', '+FORMAT(result[2])+', '+FORMAT(result[3]));

LoadDynamicallyFieldValues(tableType : 'CompInfo,PayTerm';VAR result : ARRAY [5] OF Variant)
CASE tableType OF
  tableType::CompInfo:
    BEGIN
      compInfo.GET;
      recID := compInfo.RECORDID;
      GetFieldValue(recID,'Name',result[1]);
      GetFieldValue(recID,'Address',result[2]);
      GetFieldValue(recID,'City',result[3]);
    END;
  tableType::PayTerm:
    BEGIN
      payTerm.FINDFIRST;
      recID := payTerm.RECORDID;
      GetFieldValue(recID,'Code',result[2]);
      GetFieldValue(recID,'Discount %',result[3]);
      GetFieldValue(recID,'Description',result[1]);
    END;
END;

GetFieldValue(recID : RecordID;fieldName : Text;VAR fieldValue : Variant)
// local variables
// recRef : RecordRef
// field : Record Field
// fieldRef : FieldRef
recRef.GET(recID);
field.SETRANGE(FieldName,fieldName);
field.SETRANGE(TableNo,recID.TABLENO);
IF field.FINDFIRST THEN BEGIN
  fieldRef := recRef.FIELD(field."No.");
  fieldValue := fieldRef.VALUE;
END;
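
The values come back as Variants. Here is a small sketch of how you could read a typed value back out of the result array (the variable name is just for illustration):

// name : Text
IF result[1].ISTEXT THEN BEGIN
  name := result[1]; // a Variant is implicitly converted to Text on assignment
  MESSAGE(name);
END;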

cheers

 

Get next object version number

Every developer runs into this question at some point: what is the next version number for my new object? If a couple of developers are working in the same database, you need some kind of version control.

I’ve developed a page that calculates the next version number for a defined version prefix. The version syntax is: <PREFIX><Main No.>.<2-digit Sub No.>.

Let’s start with a list of pages with different version lists, from ARCH1.00 to ARCH1.06. It does not matter if the version list of an object contains more than one value.

pic2

After running the page, set the version prefix, here ARCH, and run the action “Get next version”. We then get the new version: ARCH1.07.

pic1

You can download the page here.
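
If you prefer to build the calculation yourself instead of downloading the page, here is a minimal sketch of the idea, assuming a text variable VersionPrefix (e.g. 'ARCH') and a loop over the Object table. It is a sketch, not the code of the downloadable page, and a real implementation should check the prefix match more strictly:

// Object, Record, Object
// VersionPrefix, Text
// VersionTxt, SubTxt, Text
// Pos, DotPos, MainNo, SubNo, MaxMain, MaxSub, Integer
IF Object.FINDSET THEN
  REPEAT
    Pos := STRPOS(Object."Version List",VersionPrefix);
    IF Pos > 0 THEN BEGIN
      // take e.g. '1.06' out of '...,ARCH1.06,...'
      VersionTxt := COPYSTR(Object."Version List",Pos + STRLEN(VersionPrefix));
      IF STRPOS(VersionTxt,',') > 0 THEN
        VersionTxt := COPYSTR(VersionTxt,1,STRPOS(VersionTxt,',') - 1);
      DotPos := STRPOS(VersionTxt,'.');
      IF DotPos > 1 THEN
        IF EVALUATE(MainNo,COPYSTR(VersionTxt,1,DotPos - 1)) AND
           EVALUATE(SubNo,COPYSTR(VersionTxt,DotPos + 1))
        THEN
          IF (MainNo > MaxMain) OR ((MainNo = MaxMain) AND (SubNo > MaxSub)) THEN BEGIN
            MaxMain := MainNo;
            MaxSub := SubNo;
          END;
    END;
  UNTIL Object.NEXT = 0;

// build the next version, sub no. with 2 digits, e.g. ARCH1.07
SubTxt := FORMAT(MaxSub + 1);
IF STRLEN(SubTxt) = 1 THEN
  SubTxt := '0' + SubTxt;
MESSAGE('%1%2.%3',VersionPrefix,MaxMain,SubTxt);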

 

Export Nav Objects by Code

Exporting locked NAV objects can be a problem when importing them into a target database; it’s not that easy to unlock them there. Instead, you can export the NAV objects by code and check the Locked status before exporting.

Create a new report, add the dataitem Object and set the report property ProcessingOnly to Yes.

pic1

Select the dataitem Object and set its property ReqFilterFields to Type,ID.
On the request page add a field Path (Text).

pic2

Add the following code to the request page trigger OnOpenPage:

Object.SETRANGE(Type,Object.Type::Table);
Path := 'c:\temp';

Add the following code to the report trigger OnPreReport():

finsql := 'C:\Program Files (x86)\Microsoft Dynamics NAV\100\RoleTailored Client\finsql.exe';
IF NOT FILE.EXISTS(finsql) THEN
  ERROR('finsql not found');

Additionally, add this code to the trigger Object – OnAfterGetRecord():

IF Object.Locked THEN BEGIN
  MESSAGE(FORMAT(Object.Type)+'-'+FORMAT(Object.ID)+' is locked.');
  // alternatively unlock object, then try again.
  // Object.Locked := FALSE;
  // Object.MODIFY;
END ELSE BEGIN
  arguments := 'command=exportobjects,file=%1,servername=%2,database=%3,filter="Type=%4;ID=%5",ntauthentication=1';
  arguments := STRSUBSTNO(arguments,Path+'\'+FORMAT(Object.Type)+'-'+FORMAT(Object.ID)+'.fob',
    'localhost','Cronus',Object.Type,Object.ID);
  Process.Start(finsql,arguments);
  result := result + FORMAT(Object.Type)+'-'+FORMAT(Object.ID)+'\';
END;

And add this to the trigger Object – OnPostDataItem():

MESSAGE(result);

Global Variables:

Process DotNet System.Diagnostics.Process.'System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
arguments Text
finsql Text
Path Text
result Text
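
One thing to keep in mind: Process.Start only launches finsql.exe and returns immediately, so all exports run in parallel. If that causes trouble, you can wait for each export to finish before the next object is processed. A small sketch, reusing the Process DotNet variable from above (the static Start call returns the running process instance):

Process := Process.Start(finsql,arguments);
Process.WaitForExit; // block until finsql.exe has exported the object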

cheers

How to: Get record count for all tables

In newer NAV versions the table overview in the development environment (database information) is gone. So how do we get that list back?

Let’s go to SSMS (SQL Server Management Studio).
There, select the database for which you want that information.
Right-click the database -> Reports -> Standard Reports -> Disk Usage by Top Tables


There we are! 😉
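
If you prefer to stay inside NAV, here is a minimal C/AL sketch (e.g. in a processing-only report or codeunit) that counts the records of all regular tables via RecordRef. For a long table list you would write the result to a file or buffer table instead of showing a MESSAGE; this only shows the idea:

// Object, Record, Object
// RecRef, RecordRef
// result, Text
Object.SETRANGE(Type,Object.Type::Table);
Object.SETFILTER(ID,'<%1',2000000000); // skip virtual/system tables
IF Object.FINDSET THEN
  REPEAT
    RecRef.OPEN(Object.ID);
    result := result + STRSUBSTNO('%1: %2\',Object.Name,RecRef.COUNT);
    RecRef.CLOSE;
  UNTIL Object.NEXT = 0;
MESSAGE(result);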

 

How to work with big item descriptions

Sometimes you need to save very long descriptions, but in NAV text fields can hold only 250 characters. Additionally, you may want to search in these long text values. You could add a couple of extra text fields to store long texts, or save the text in text files and attach them to the item. But what about searching? Not that easy.
Another option for saving long texts is to use BLOB fields. For that option I developed a solution.

First add a new field “Description 3” to table Item, type BLOB, subtype Memo.
Then edit page Item Card and add a global variable Desc3Txt of type Text with no length. Add the variable as a new field to the item card, with Editable=False and MultiLine=Yes.
Add the following code to the trigger OnAfterGetRecord of the item card page:

// InStr | InStream
CALCFIELDS("Description 3");
IF "Description 3".HASVALUE THEN BEGIN
  "Description 3".CREATEINSTREAM(InStr);
  InStr.READ(Desc3Txt);
END;

Add the following to the trigger Desc3Txt – OnAssistEdit():

// OutStr | OutStream 
// EditCtrl | DotNet | Archer.TextEdit.'Archer.TextEdit, Version=1.0.0.0, Culture=neutral, PublicKeyToken=1465b259ee2284cb' 
CLEAR(EditCtrl);
EditCtrl := EditCtrl.TextEdit;
EditCtrl.Load(Desc3Txt);
EditCtrl.ShowDialog;
Desc3Txt := EditCtrl.Save;
EditCtrl.Close;
CLEAR(EditCtrl);

"Description 3".CREATEOUTSTREAM(OutStr);
OutStr.WRITE(Desc3Txt);
MODIFY;
CurrPage.UPDATE;

itse-1

Page item card with new multiline text field “Description 3” and Assist Button.

itse-2

Clicking the assist button starts the TextEdit control and loads the current text. After changing the text and closing the control, you are asked whether you want to apply the change.

Now we need a way to search within the new text/BLOB field, because it cannot be searched with the standard search function. For that we need a new page Item Search.

Create a new page with object id 50000, name Item Search. Add a group under the content area, add a global text variable SearchString, and add a new field under the group with SearchString as SourceExpr. Add another group and add a new line of type part; later we set the value of its property PagePartID.

itse-3

To show the search result we need another page: type ListPart, object id 50001, name Item Search Result. As the source of the new page we need a new table: object id 50000, name Item Search Result.

itse-4

The new page 50001:

itse-5

Properties: Editable=False, SourceTable=50000 (Item Search Result), SourceTableTemporary=Yes.

Now add a global function SetData(SearchFilter : Text) to the new page and add the following code to it:

// local variables
// Item, Record, Item 
// ItemSearchResultLine, Record, Item Search Result 
// InStr, InStream 
// Desc3Txt, Text 
// LineNo, Integer 

DELETEALL;

LineNo := 10;
IF Item.FINDSET THEN
REPEAT
  Item.CALCFIELDS("Description 3");
  IF Item."Description 3".HASVALUE THEN BEGIN
    Item."Description 3".CREATEINSTREAM(InStr);
    InStr.READTEXT(Desc3Txt);
    IF STRPOS(LOWERCASE(Desc3Txt),LOWERCASE(SearchFilter)) > 0 THEN BEGIN
      "Line No." := LineNo;
      "Item No." := Item."No.";
      Description := Item.Description;
      "Description 2" := Item."Description 2";
      "Description 3" := COPYSTR(Desc3Txt,1,250); // first 250 chars
      INSERT(FALSE);
      LineNo += 10;
    END;
  END;
UNTIL Item.NEXT = 0;

CurrPage.UPDATE(FALSE);

Now you can set the property PagePartID in the part line in page 50000 to 50001.

To call the search function we need a Search action on page 50000.
Add the following code to the trigger Search – OnAction() (assuming the page part is named ItemSearchResultLines):
CurrPage.ItemSearchResultLines.PAGE.SetData(SearchString);

itse-6

The new Item Search Page with a search result.

You can download the TextEdit Control here.

Followup:
You could simplify the solution:
* Search page: Use only page 50001 and add a second group at the top with the field SearchString, so page 50000 is not needed.
* Page Item Card: Remove the TextEdit control and the according code, set the field Desc3Txt to editable, and add the code that fills the BLOB field “Description 3” via OutStream to the trigger Desc3Txt – OnValidate (see the sketch below).
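
A minimal sketch of that OnValidate trigger, reusing the OutStream code from the assist-edit version above:

Desc3Txt - OnValidate()
// OutStr, OutStream
"Description 3".CREATEOUTSTREAM(OutStr);
OutStr.WRITE(Desc3Txt);
MODIFY;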

cheers

 

Mass data import

There was an issue with importing all UK post codes, about 3 million of them, from a csv file. Importing via RapidStart Services (Excel import) can cause buffer overflow messages, and Excel itself has row/size limitations. Increasing MaxNoOfXMLRecordsToSend in the client config file ClientUserSettings.config from the default value 5000 to e.g. 20000 is no problem and can help. Changing MaxUploadSize in the server config file CustomSettings.config (also available via the NAV service administration console) is another option. A better choice for mass data import are dataports (in older NAV versions) and XMLports.

Another option is to develop a report which imports the file content and loops through the lines. Quite simple, and no memory issues.

Create a new report and add the following code to the trigger OnPreReport():

OnPreReport()
// variables
 PostCode, Record, Post Code
 file, File
 fileName, Text, 250
 line, Text, 1024
 dlg, Dialog
 idx, Integer
 txtValue, Text, 100
// code
 PostCode.DELETEALL(FALSE);
 COMMIT;
 dlg.OPEN('#1###### #2######'); // show progress dialog
 idx := 1;

// downloaded post code file from https://www.doogal.co.uk/PostcodeDownloads.php
// as test file, size: 500k, 2.1m lines, some of them contain obsolete post codes
 fileName := 'c:\temp\England postcodes.csv';

 file.WRITEMODE := FALSE;
 file.TEXTMODE := TRUE;
 file.OPEN(fileName);
 file.READ(line); // skip header line
 WHILE file.READ(line) > 0 DO BEGIN
   // only import active post codes (2nd value = 'Yes'); skip obsolete ones
   IF SELECTSTR(2,line) = 'Yes' THEN BEGIN 
     PostCode.Code := SELECTSTR(1,line);
     PostCode.VALIDATE(City,GetValue(SELECTSTR(15,line)));
     PostCode.County := GetValue(SELECTSTR(8,line));
     PostCode."Country/Region Code" := 'GB';
     PostCode.INSERT;
     dlg.UPDATE(1,idx);
     dlg.UPDATE(2,PostCode.Code);
     idx += 1;
   END;
 END;

 file.CLOSE;
 dlg.CLOSE;

 MESSAGE(FORMAT(idx - 1) + ' post codes imported.'); // idx was incremented once more than records inserted

LOCAL GetValue(txtValue : Text[100]) : Text
 txtValue := DELCHR(txtValue,'<','"'); // remove leading "  
 txtValue := DELCHR(txtValue,'>','"'); // remove trailing "
 IF STRLEN(txtValue) > 30 THEN // fields City,County are Text[30]
   txtValue := COPYSTR(txtValue,1,30); // cut text, leading 30 chars

 EXIT(txtValue);

The report processes the roughly 1.5m valid lines/records in about 5 minutes.

cheers

ReplaceString and a new SelectString

With the standard function CONVERTSTR you can replace characters in a string, but it works character by character, and the set of original characters and the set of replacement characters must have the same length. So what do you do if you want to replace a substring by a replacement of a different length? Here is a solution.

ReplaceString(String : Text[250];OrigSubStr : Text[100];ReplSubStr : Text[100]) : Text[250]
// StartPos : Integer
// note: ReplSubStr must not contain OrigSubStr, otherwise the loop never ends
StartPos := STRPOS(String,OrigSubStr);
WHILE StartPos > 0 DO BEGIN
  String := DELSTR(String,StartPos) + ReplSubStr + COPYSTR(String,StartPos + STRLEN(OrigSubStr));
  StartPos := STRPOS(String,OrigSubStr);
END;
EXIT(String);
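
A quick test for ReplaceString (note that the replacement must not contain the original substring, see the comment above):

// line : Text[250]
line := 'this is a nice thing, a nice solution';
line := ReplaceString(line,'nice','really cool');
MESSAGE(line); // 'this is a really cool thing, a really cool solution'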

SELECTSTR is a nice function to get text values out of a text line by simply giving the number/position of the value within the line, e.g. in ‘this,was,a,cool,thing’ the value ‘was’ has position 2, so SELECTSTR(2,line) returns ‘was’. SELECTSTR expects a comma as delimiter. That is not so good if the values themselves contain commas (e.g. decimal values in Europe). It would be nice to have a kind of SelectString function where you can define the delimiter. Hm … let’s try something.

SelectString(Number : Integer;String : Text[250];Delimiter : Char) : Text[100]
// i : Integer
// EndPos : Integer
// SubString : Text[100]
i := 1;
WHILE i <= Number DO BEGIN
  EndPos := STRPOS(String,FORMAT(Delimiter));
  IF EndPos > 0 THEN BEGIN
    SubString := COPYSTR(String,1,EndPos-1);
    String := DELSTR(String,1,EndPos);
  END ELSE
    SubString := String;
  i += 1;
END;
EXIT(SubString);

To test the new function we use this:

// line : Text[250]
// i : Integer
line := 'ab;cd;ef;cd';
FOR i := 1 TO 4 DO
  MESSAGE(SelectString(i,line,';'));

cheers

 

Convert Tab delimited strings

I had to process a csv file with TAB-delimited text lines, but the existing code expected text lines with ; as delimiter.

The TAB character has ASCII code 9.

So I wrote the following code:

ConvertTabString(line : Text[250]) : Text[250]
// ch : Char
ch := 9; // TAB
line := CONVERTSTR(line,FORMAT(ch),';');
EXIT(line);

test code:

// set a line value (the gaps below are TAB characters) or load the line from a text file using file.OPEN
line := 'ab cd  ef  cd';
line := ConvertTabString(line);
MESSAGE(line);

result: 'ab;cd;ef;cd'

An interesting thing is that this code does not work in older NAV 2009 builds, maybe due to a problem with the encoding. For that case I developed a second version.

// text file test.txt contains one text line 'ab cd  ef  cd' with TAB as delimiter
// file : File
// instr : InStream
// ch : Char
ch := 9; // TAB
file.OPEN('c:\temp\test.txt');
file.CREATEINSTREAM(instr);
instr.READTEXT(line,MAXSTRLEN(line));
line := CONVERTSTR(line,FORMAT(ch),';');
MESSAGE(line);
file.CLOSE;

This can also be used to export the converted lines to a new text file, which can then be imported via dataport/xmlport.

cheers

Textual Data Export by Configuration

A feature customers often ask for is a data export to a text/csv file, which can then be edited in Excel or a text editor.

Yes, you can use RapidStart Services. But with that feature you can only export data in Excel format, not in text format. That’s fine for editing in Excel and reimporting the changed data back into NAV, but for data exchange scenarios you usually need text format. Another point is that RapidStart Services are often said to be not that easy for end users to use and to have a couple of bugs; I have also read quite often that the feature is not recommended at all. So what else can a customer do? Contact the NAV partner, who then develops a new XMLport, report or codeunit to do the job. That’s the usual way.

The German localised version of Dynamics NAV contains a really nice feature, simply called “Data Exports”. You can find it in the menu /Departments/Administration/Application Setup/Financial Management/General/Data Exports. This feature is mainly used to export business data for auditing purposes according to the GDPdU (principles of data access and verifiability of digital documents).

The “Data Exports” feature offers an additional way to handle this kind of requirement: it is possible to export data to csv files without any development! A great thing. So let’s have a look at how we can use it.

dx1

First we need at least one definition group. Then we define the corresponding record definition (button “Record Definitions”).

dx2

Here we need values for Code, Description and Export Path. In this case I want to export sales orders. “DTD File Name” is a mandatory field. I checked the code: the file is not processed, and using different kinds of dtd files does not change anything in the resulting export files. The name of the dtd file is simply written to the file index.xml, which contains the data structure. To get the thing running, we create a text file, call it empty.dtd and add a single line of text (the content does not matter).

Save the text file, click the Import button in the menu tab “DTD File”, select the file empty.dtd and click OK. After that the file is imported and its name is set in the column “DTD File Name”.

Now let’s define the table and the fields. For that select the record definition and click on “Record Source”.

dx3

In the header area we set Table No. to 36 (Sales Header) and Key No. to 1 (the primary key). Optionally, you can set the Period Field No., here to 19 (Order Date). In the column Table Filter you can set filters if you do not want to export all records. Here I set the field to “Sales Header: Document Type=Order”, because I only want to export sales orders.

In the fields area we add the export fields by clicking the Add button. The key fields should come first. With Move Up and Move Down you can change the position of the fields. That’s all.

You can, if you wish, add more tables in the header area and set relationships, e.g. between Sales Header and Sales Line. But for now let’s run the thing. First we click Validate and get “The data export record source validated correctly.”, which means all is ok.

So, after closing the page “Data Export Records Source” we are back on the page “Data Export Records Definitions”. Here we click the Export button in the menu tab “Process”. Report 11015 “Export Business Data” is started:

dx4

Here we set the start and end date. These fields are mandatory and are applied to the period field of the header line, in this case the field “Order Date”. After execution we get some files in the export path, in the subfolder SALESORDER.

dx5

File empty.dtd is simply copied, file index.xml contains the data structure:

dx6

and file SalesHeader.txt contains the exported data (csv format):

dx7

There we are!
We got a csv export file without any development, only by configuration!

As mentioned before, this can be extended by adding more lines in the header area of the page “Record Source”:

dx8

Here you can see two header lines, “Sales Header” and “Sales Line”; the latter is indented and has a defined relationship, shown in the relationship part in the bottom right corner of the screenshot above. In that case you get two export files, one per table.

Important: If you are interested in that feature, you could download the German localised version. But … for that you need the corresponding license!
Exporting the NAV objects as text and renumbering them into the 50000 object range is possible, but illegal! So if you like that kind of feature, try to redevelop the functionality yourself. Do not copy these objects into your database and do not reuse the code. If you want to do that, please contact your NAV partner or Microsoft for license clarification. You could also suggest adding that functionality to the W1 version.

For more details about that feature follow this.

cheers

Dynamics NAV – Show table relations

In the posting “The big picture” I published an upgraded version of Kamil Sacek’s tool to show all the table relations. That version needs a developer license, so it’s not usable for end users without one.

For those end users I developed a simplified solution which does not need a developer license. But … it does not show all table relations, only the main ones. To get the FlowField and lookup relations, the developer-license version is needed.

aw10-nav2015-1

You can download the NAV objects here.
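
If you only need a quick list of the main relations of a single table, the virtual table Field (2000000041) already helps; it exposes the relations defined in the table designer, but no FlowField or code-based lookups, as mentioned above. A minimal sketch:

// field, Record, Field
// result, Text
field.SETRANGE(TableNo,36); // e.g. Sales Header
field.SETFILTER(RelationTableNo,'<>%1',0);
IF field.FINDSET THEN
  REPEAT
    result := result + STRSUBSTNO('%1 -> Table %2\',field.FieldName,field.RelationTableNo);
  UNTIL field.NEXT = 0;
MESSAGE(result);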

cheers