For a long time, we (as a partner community) have been asking for longer field lengths. And this time, Microsoft seems to have delivered: about 860 field lengths were changed – mostly from Text50 to Text100, but also from Text30 to Text50, and so on. For your convenience, I created a CSV in my “CALAnalysis” repo on GitHub that lists all of them.
With this change, there are some caveats that I can think of – and there might be even more than I list here below.
First of all: what if a Business Central database contains C/AL customizations, AL extensions, or apps on AppSource? If a database is upgraded to the Spring Release, and somewhere in your code you assign a field that is now Text100 to some other field in your solution that is probably still just Text50, there is a danger of an overflow – which (I think) is a runtime error.
In the case of extensions, code analysis should in theory catch this: you should get “overflow” warnings when you compile your code against symbols from BC version 14. For an app on AppSource, ISVs should already have caught it this way in their build against the “next version” (insider) – if they have set that up, of course.
For “Per Tenant Extensions”, you typically don’t set that up, so I’d strongly advise you to start checking all these extensions, as unexpected errors might happen once people start using these new lengths… I can only imagine what kind of situations that would cause – it could turn into a support nightmare…
For customizations (I mean ANY solution that is still in C/AL), you are quite “bleeped” (pardon my French), because we don’t have code analysis there – no compiler is going to help us. So if you upgrade your C/AL to the Spring Release, I’d highly advise you to really take this into account. I will suggest a few ways to handle it further on in this blog post…
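To make the danger concrete, here is a minimal AL sketch – “ShortDescription” is a hypothetical custom Text[50] variable, not part of the base application:

```al
procedure CopyDescription(ItemNo: Code[20])
var
    Item: Record Item;
    ShortDescription: Text[50];
begin
    Item.Get(ItemNo);
    // Since the Spring Release, Item.Description is Text[100].
    // A direct assignment would throw a runtime overflow error as soon
    // as a user actually enters more than 50 characters:
    // ShortDescription := Item.Description;

    // The safe pattern is to truncate explicitly to the target length:
    ShortDescription := CopyStr(Item.Description, 1, MaxStrLen(ShortDescription));
end;
```

In AL, the compiler can warn you about the commented-out assignment; in C/AL, nothing will – it just blows up at runtime.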
Another caveat I can think of is reports. All of a sudden, an item description can be 100 characters. I don’t know any report that fits a 100-character description – or maybe I’m missing something. And do realize we are talking about 860 fields here, not “just” the Item Description. So potentially any kind of text field on a report can end up not being fully displayed… I don’t know about you, but I know my customers will not appreciate this.
What can you do?
The most obvious thing to do is compile your code against the latest release. For any kind of extension – and especially the ones on Business Central SaaS, since those get upgraded automatically – you should set up a scheduled build against the “next release”. You can easily do that with DevOps in combination with Docker, and it’s highly valuable, as you would catch many of these possible overflows that way.
That said, I don’t know how TRANSFERFIELDS behaves here – or, for that matter, other statements that assign values to variables. Please pay close attention to this.
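A sketch of what I mean – “My Item Buffer” is a hypothetical custom table that still carries a Text[50] description field:

```al
procedure BufferItem(ItemNo: Code[20])
var
    Item: Record Item;
    ItemBuffer: Record "My Item Buffer";
begin
    Item.Get(ItemNo);
    ItemBuffer.Init();
    // TRANSFERFIELDS copies by field number, so the compiler has no
    // opportunity to warn you here. If the buffer's description is
    // still Text[50] and the Item description holds more than 50
    // characters, I'd expect a runtime error - test it yourself!
    ItemBuffer.TransferFields(Item);
    ItemBuffer.Insert();
end;
```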
The way to solve this is to increase the lengths of your own fields as well – obviously not by cropping values and losing data in the tables where they would end up.
Roll Back Microsoft’s changes
This probably sounds like the most ridiculous option you can think of. I mean: change 860 field lengths back to their original lengths? Are you kidding me? Well, first of all, in my opinion, it’s a valid option for OnPrem C/AL solutions (obviously not for any kind of extension/app solution). It’s actually what we did – at least as a temporary way to go. The reason is twofold. First: this is the last C/AL release of our product; the next release will be full-extension, and once we migrate to our new solution, there is no problem anymore. Second: this was the only way we could identify 100% of the changes we needed to make to get the whole solution stable again (frankly, we weren’t waiting for bigger lengths). After quite a few attempts, there was simply no way for us to identify all the places where we assign these changed fields, and where the values would eventually end up in custom fields that might cause a problem.
And maybe a third reason, on top of the two I was going to mention: we could automate this.
This is the simple script we used to roll back these changes (my colleague created this one – it was twice as fast as my version ;-)): https://github.com/waldo1001/Waldo.Model.Tools/blob/master/ChangeObjects/RestoreFieldLengths.ps1
Change all Text-datatypes to Text100
If we can do the above, we would also be able to identify all Text fields with a length lower than 100 – and change them to 100, right? It’s actually fairly easy to do, but in my opinion not 100% safe (some Text fields were changed to 260 or even 2048). And obviously, it wouldn’t solve the report problem…
So we didn’t go for this option, and I can’t give you the script – but at least you have the building blocks to create it ;-).
How did I analyze this?
Well, as you might have read, I used the code analysis tool we created internally (and blogged about in this post). You can find the script that compares the field lengths and lists the datatype differences on my GitHub here: https://github.com/waldo1001/Waldo.Model.Tools/blob/master/Analyze/CompareFieldLengths.ps1
So – we survived the snap – hope you will as well ;-).
With a tool like Statical Prism, you can search for all the places where a certain field is being used, including TRANSFERFIELDS. If you create a text export of the objects from C/SIDE, you can go through it and identify the places that need to be changed.
For extensions, this tool is not the solution for finding all the places where a certain field is used. I can think of a way you could do that, but it’s a very ugly one, so I’d better keep it to myself :-).
Agree and disagree.
The thing is… I’d like to automate it ;-). That way, I can be sure.
But at this point, with Prism or any other tool, it’s still a manual process. What if I assign the field to a variable, which gets assigned through a big CASE, and is then assigned to another variable that ends up in a custom field? It’s a chain of statements that might eventually end up in a field. Any such tool only shows the direct assignments, with no insight into the chain.
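A sketch of such a chain (all custom names here are hypothetical) – every single assignment looks innocent, but the value still ends up in a Text[50] field:

```al
procedure ChainedAssignment(Item: Record Item)
var
    Buffer: Text; // unbounded, so the first assignment looks harmless
    CustomRec: Record "My Custom Table";
begin
    case Item.Type of
        Item.Type::Inventory:
            Buffer := Item.Description; // Text[100] into Text: no problem
    end;
    // Only this last step can actually overflow at runtime - and it is
    // two assignments away from the changed standard field, which is
    // exactly what a search for direct usages will not show you:
    CustomRec."Short Description" := Buffer;
    CustomRec.Insert();
end;
```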
I was looking into creating a recursive function, but I gave up… It just isn’t worth it, in my opinion – and it wouldn’t solve the report part anyway…
As for the ‘report-problem’: the field length got increased, but not the data in it. So reports will be fine with the current data. It starts to be a problem when users decide to take advantage of the increased field length and put more data in it than will fit on the report. However, that’s not a problem on day 1 I would say. And end-users are responsible for their own data in the first place.
For example, there are also reports that don’t fit a 20-character item no., while the field could potentially hold that many characters.
That’s a “special” way to look at things – customers are responsible for their data, so if we provide a field of 100, and it only displays a part of it – it’s their problem?
In our case, it was a problem on Day 0, when testing the bigger lengths – on reports AND in business logic. We weren’t even able to simply post. Try to fix that – maybe not on Day 1, but on Day 10, while multiple customers face the same problem in different parts of the application…
Or even on AppSource, where customers get upgraded automatically and start facing all these problems at the same time (know that “overflow” is “just” a warning – which, I just noticed, doesn’t seem to work anymore in the first place).
You shouldn’t minimize this… it’s a huge issue that everyone should pay immediate attention to, no matter which environment they’re on.
Just my 0.02€
I didn’t want to minimize the problem for posting or business scenarios. I’m just saying that on day 1, data on reports is not the problem – simply because the data didn’t grow, only the field length did.
As for reports, I can imagine customers need to decide if they want to wrap lines or not. Some would like that, others maybe not. If you don’t like it, then don’t use the 100 available characters because it doesn’t fit. We all have been doing modifications like smaller font size, smaller page margins, etc, in order to fit data on reports that otherwise would wrap. Which was always a per-customer task, because every customer has a different report layout. All I’m saying is that reports will be fine with the current data. Report layouts are different from business logic.
Again, I’m not trying to minimize this, and custom field lengths must be increased as well, because the total solution should support the increased field lengths from day 1 – which can be a huge task, as you point out. I’m just trying to understand why it is a problem (for reports) with existing data that fits the old field length.
Agreed… the report problem isn’t a huge one in itself – just a frustrated customer, maybe ;-). The rest, though…
Partners with products usually make sure that the maximum field length fits the reports – at least that’s what we do. It saves “per customer” customization cost…
Apart from this, I found one other change in the Spring ’19 update: the control names have changed in standard objects, so existing extensions will not be compatible with the new release – and it will take some extra effort to solve that.
Very true …
Hi Waldo, what about the cloud? On-prem is a whole different world from the cloud.
How can we solve this?
Thank you so much,
That error, only Microsoft can solve – and you should follow up on it on GitHub.
My 2 cents: as far as I know, there is no way to “force” anything in the cloud, so you can’t get that field out, nor can you change the name (because it’s a breaking change). So I’m actually curious what Microsoft’s response is going to be in this case…
Thank you so much for the quick reply. Microsoft told us that the last day to upgrade my customer is this 12th of June… so… I don’t know what to do…
Do you have any info confirming this date? Or at least on what to do about it…