Using DX without dependency hell?

I am thinking of leveraging DX, but I don’t want to go through the dependency-management hell and the whole packaging mess.

Basically, I am thinking of just storing code in the repo in SFDX format, using that for scratch orgs, and simply converting to MDAPI format for production and sandbox deployments.

What I am gaining:

  • Can leverage scratch orgs, since my code is in SFDX format.
  • The new source format is easier to work with, since it breaks down the .object file.
  • Still happy with the happy soup. No need to spend years breaking down stuff, managing dependencies, worrying about duplicate metadata, package versioning, package version dependencies, transitive dependencies, over-engineered dependency injection, and all the baggage that comes with DX. Honestly, I don’t see how we can ever justify that the benefit is worth the effort. I don’t want to spend 1.5 years to get nowhere like the customer did here: https://www.youtube.com/watch?v=MY2_AfjtBp8

  • Not having to deal with multiple CI processes (one for packaged and another for unpackaged metadata), since you can’t package everything anyway.

What I am losing:
  • Deploying everything at once, so deployment is slower compared to the package-installation approach.

I don’t want to blindly follow whatever Salesforce people say, because they have to sell what they made, whether it is good or bad. It’s the customers who pay. I’d appreciate any honest opinion that doesn’t come from Salesforce evangelists or MVPs.

Also, please answer in the context of a complex org with years of metadata, not the DreamHouse app.

Below is what was discussed in the video, which is exactly what I am not going to do.

(screenshots from the video)

Answer

Basically, I am thinking of just storing code in the repo in SFDX format, using that for scratch orgs, and simply converting to MDAPI format for production and sandbox deployments.

As of Winter ’19, this is no longer necessary. You can use the new force:source:deploy command to deploy your DX-format source tree directly, without the hassle of force:source:convert. This also basically eliminates the need for the classic MDAPI format, unless you need it for an IDE or some other reason. Regardless, it’s still a sound strategy overall if you have a complicated setup, like we do.
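To make the two workflows concrete, here is a minimal sketch. Since the sfdx commands require an authenticated org, the script only assembles and prints them; the org alias `prod` and the source path `force-app` are assumptions, not values from the original post.

```shell
#!/bin/sh
# Dry-run sketch: print the commands instead of running them, since both
# flows need an authenticated org. Alias "prod" and path "force-app" are
# illustrative assumptions.

# Old two-step flow: convert the DX source tree to MDAPI format, then deploy.
OLD_FLOW="sfdx force:source:convert -r force-app -d mdapi-out && sfdx force:mdapi:deploy -d mdapi-out -u prod -w 30"

# New one-step flow (Winter '19 and later): deploy the DX source tree directly.
NEW_FLOW="sfdx force:source:deploy -p force-app -u prod -w 30"

echo "$OLD_FLOW"
echo "$NEW_FLOW"
```

The one-step flow also means your repo never needs a checked-in MDAPI copy of the metadata.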

Can leverage scratch orgs, since my code is in SFDX format.

Technically, you could do that with the force:mdapi:deploy command, which works in both scratch orgs and other orgs. There’s no specific need to use force:source:push if you don’t want to.
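In other words, either command can target a scratch org. Another dry-run sketch (printed rather than executed, since both require an authenticated org; the alias `scratch` and path `mdapi-out` are assumptions):

```shell
#!/bin/sh
# Both of these work against a scratch org; only the first uses source tracking.
PUSH="sfdx force:source:push -u scratch"                       # tracked push
DEPLOY="sfdx force:mdapi:deploy -d mdapi-out -u scratch -w 10" # plain MDAPI deploy
echo "$PUSH"
echo "$DEPLOY"
```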

Still happy with the happy soup. No need to spend years breaking down stuff, managing dependencies, worrying about duplicate metadata, package versioning, package version dependencies, transitive dependencies, over-engineered dependency injection, and all the baggage that comes with DX. Honestly, I don’t see how we can ever justify that the benefit is worth the effort.

Note that packages are completely optional, and in fact, I’d recommend you stay away from them if you’d need more than about 5-10 or so, as they quickly start to make a mess of things. For new customers with no pre-existing configuration, though, I would recommend packages. May as well start off on the right foot.

For small-to-medium size orgs, I’d recommend researching if packages are viable or not. For large organizations, like ours, packaging is still pretty much a pipe dream. We might eventually one day start building packages, but many of our features have incredibly complicated dependencies. We can select a single item and end up finding hundreds of dependent items.

Not having to deal with multiple CI processes (one for packaged and another for unpackaged metadata), since you can’t package everything anyway.

It depends on how you do your CI. For example, our CI diffs the source against the destination, so we never do a full deployment anyway. In our case, packages would be largely redundant, because we’re basically already doing what DX packaging offers. However, even if you just package your core system library, you might still see considerable savings in deployment time that could justify the added CI complexity.
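The diff-based idea can be sketched with plain git: list the metadata files that changed since the last deployed commit, then feed that list to the deploy step. The throwaway repo, file names, and the final sfdx command below are all illustrative assumptions.

```shell
#!/bin/sh
# Sketch of diff-based CI: deploy only what changed since the last deploy.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
mkdir -p force-app/main/default/classes

# Pretend this commit is what production currently has.
echo 'public class Foo {}' > force-app/main/default/classes/Foo.cls
git add -A && git -c user.email=ci@example.com -c user.name=ci commit -qm "deployed"
deployed=$(git rev-parse HEAD)

# New work lands on top of it.
echo 'public class Foo { /* changed */ }' > force-app/main/default/classes/Foo.cls
echo 'public class Bar {}' > force-app/main/default/classes/Bar.cls
git add -A && git -c user.email=ci@example.com -c user.name=ci commit -qm "new work"

# Only the changed files need to go out.
changed=$(git diff --name-only "$deployed" HEAD -- force-app)
echo "$changed"
# A real pipeline would then run something like (hypothetical):
#   sfdx force:source:deploy -p "$(echo "$changed" | tr '\n' ',')" -u prod
```

The diff step is generic; only the final deploy command is Salesforce-specific.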

But, there’s always a trade off. Complexity for deployment time. You have to consider how valuable your time is, because you’re going to end up spending it one way or the other. If you have a setup where deployments take only a few minutes anyways, packaging probably isn’t worth the complexity.

One place I worked at had a 45 minute deployment time. If we had DX then, and we could have reduced our complexity such that deployments only took 15 minutes, we would have spent the time to build packages. Imagine a deployment failing 4 times at 45 minutes each (5 total deployments). I had that happen to me, and I didn’t leave the office until 2am as a result (deployments started at 10pm).

Deploying everything at once, so deployment is slower compared to the package-installation approach.

Yes, and no. You’re really not losing much in most cases, because the duplicate items will be a no-op (they don’t really change). In many orgs, the actual deployment time is overwhelmingly unit tests, which packages are not going to help you skip anyways.


To address more specific concerns…

No need to spend years breaking down stuff,

That’s the Salesforce “organic migration” approach. And, as far as I can tell, it might be viable, if you even knew where to start. Most orgs, I wager, have a ton of deeply nested dependencies you simply can’t break easily, so you’d end up with either a large core and lots of small side packages, or simply give up and put everything into one big package, which defeats the purpose of it.

managing dependencies,

Especially since it’s all completely manual. If we had a tool to automate the dependency resolution, it might be … not as bad. Starting from ground zero, the dependencies would be manageable. For existing bases, especially as large as ours, packaging would be a nightmare. We don’t use packaging for this reason.

worrying about duplicate metadata,

That’s actually more of a non-worry, because the deployments always just seem to go okay regardless of duplicates, so long as they don’t conflict with each other. Honestly, I was surprised by how DX seemed to do the right thing consistently, as long as I didn’t do anything too obviously broken.

package versioning,

The system kind of takes care of the versioning for you, so it’s not that bad of an issue.

package version dependencies,

Generally a non-issue, because DX does a decent job of managing them for you. It’s rare that you’d have to deal with this directly once everything is set up.

transitive dependencies,

I’m not sure how this applies in a metadata context. I’d love a concrete example of how this might be a problem.

over-engineered dependency injection …

I agree: one should not use DI just to fulfill a packaging requirement. That is the point of dependent packages, though. However, I do understand that situations could arise where A depends on B, but B also depends on A. The typical solution is to move the common dependencies into a third package, C, so that A and B both depend on C. This leads back to an earlier point: you’d probably end up with a huge core library and lots of small packages that depend on it, which defeats the purpose of packaging.


There’s one other potentially damaging loss: namespaces. Using package namespaces, you can actually eliminate a lot of the duplicate-metadata problems by isolating components and being able to refer to them uniquely. This is comparable to languages like C++ and C# that have had namespaces forever. If you find yourself prefixing classes all the time (e.g. Account_Extension, Account_TriggerHandler, etc.), using packages might make sense for you and help isolate code.


I wouldn’t dismiss packaging outright (even we intend to eventually use it, if we can get the features we asked for), but also don’t feel bad if you decide not to use it. I feel like you just might want to do some more research before you conclusively say “no, I’ll never do this” (your question already reads like a foregone conclusion). A lot of the features that exist are promising.

You can even do pseudo-packaging for right now; set up a bunch of paths to sort your metadata into, but don’t actually build the packages. This might save you on deployment times in the future. And you don’t need to do it all at once, either, just every time you’re in a particular area, start picking out pieces. Do it as part of the normal development cycle. You’ll hardly even notice the difference. As a bonus, if you decide to package, you’ve already done the hard part, and if not, you can use your source tree as is.
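One way to sketch that pseudo-packaging layout, under assumed (hypothetical) directory names, is simply to list multiple paths in sfdx-project.json’s packageDirectories, without ever actually building packages from them:

```json
{
  "packageDirectories": [
    { "path": "src/core", "default": true },
    { "path": "src/sales" },
    { "path": "src/service" }
  ],
  "namespace": "",
  "sourceApiVersion": "44.0"
}
```

Each path holds one would-be package; if you later decide to package for real, the sorting work is already done.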


tl;dr

Ultimately, the choice is yours. One size does not, and cannot, fit all. DX is designed to fit a certain demographic of clients, but it certainly cannot accommodate everyone. And it’s still missing features that are critical to using it as advertised. If you don’t care for packaging, you’re not forced to use it. If you want to use the old mdapi format, or the new DX format, you have that choice (especially with the new force:source:deploy command).

Please note that DX isn’t particularly a selling point, as salesforce.com isn’t making any money off this, at least not in the direct sense. They’re genuinely trying to make development easier and more manageable, like other modern languages; it’s a direct response to the numerous complaints from ISVs, large clients, and community developers at large. DX is a tool, like a hammer or a screwdriver. It’s up to you to figure out how you’re going to use it, or if it’s even the right choice.

Attribution
Source: Link, Question Author: codeinthecloud, Answer Author: sfdcfox
