
Thursday, June 1, 2017

Trigger + Process + Workflow = Recursion Hell?

As your Salesforce org matures, chances are you'll find yourself trying to untangle Apex triggers, processes created with Process Builder, and workflow rules. When field updates are involved in particular, predicting the outcome of the order of operations can be a pain, especially because the documentation still leaves room for questions in cases involving recursion.

Follow the two rules below for a reduced-pain implementation.

If you're updating fields on a record in Process Builder, and you've marked the "Recursion" checkbox, know that every time the record is updated by the process, before and after trigger events will fire again. This is also true for updates made by a process invoked as an action by another process.


So all in all, remember that a single Apex trigger could run 14 (fourteen) times for a single DML operation! If you're mixing triggers with processes and workflow rules, make very sure your business logic in triggers will survive recursion.
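One common defense is a static-variable guard. Below is a minimal sketch assuming a hypothetical handler class; note that a naive Boolean flag can also suppress legitimate re-runs for different records, so adapt it to your own trigger framework.

    public class AccountTriggerHandler {
        // Static variables persist for the duration of the transaction,
        // so re-entrant trigger invocations can detect that logic already ran.
        private static Boolean hasRun = false;

        public static void handleAfterUpdate(List<Account> records) {
            if (hasRun) {
                return; // skip business logic on recursive re-entry
            }
            hasRun = true;

            // One-time-per-transaction business logic goes here
        }
    }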

Monday, May 1, 2017

getValue() getter method vs. { get; } shorthand

Salesforce's { get; set; } syntax has been around for a long time and is a time-tested, reliable way to define properties in Apex. But after testing its usability and limitations in Spring '17, I've decided that explicitly declaring traditional getter and setter methods should be preferred over using the convenient { get; set; } syntax.

The primary reason is that the only way to expose a property in a custom Apex class for use with Lightning Components is to use the @AuraEnabled annotation, and this annotation only works on a traditional getter method such as String getName().

The secondary reason is that the developer also has the option to either call the getter or access the private field directly from other methods in the class, which is not possible when using { get; set; }.
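To illustrate both points, here's a minimal sketch (the class and member names are mine, not from any official example):

    public with sharing class AccountWrapper {
        private String name;

        // Traditional getter: per the Spring '17 testing above, this is
        // where @AuraEnabled can be applied, and other methods in this
        // class can bypass it by reading the private field directly.
        @AuraEnabled
        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        // Shorthand property: compact, but there's no separate private
        // field to access and no traditional method to annotate.
        public String phone { get; set; }
    }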

Tuesday, February 2, 2016

Get Full DateTime Format (GMT) in Apex

To get the full DateTime format in GMT such that it's compatible with Apex methods like JSON.deserialize(), the most accurate method is to call DateTime.formatGmt().
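For example, the following anonymous Apex prints a value like 2016-02-02T14:30:00.000Z (the format string is my assumption, chosen to match the timestamp format that JSON serialization produces):

    DateTime current = DateTime.now();
    String fullFormat = current.formatGmt('yyyy-MM-dd\'T\'HH:mm:ss.SSS\'Z\'');
    System.debug(fullFormat);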


For comparison, below are some other alternatives for generating similar String values for a DateTime object.

Sunday, May 31, 2015

Who Sees What: Record Access Via Roles (Corrected)

I'm admittedly a bit disappointed in the Who Sees What: Record Access Via Roles video. When I first read about the Who Sees What series, I thought it was awesome that salesforce.com decided to produce visual aids on the topic of security. As an alternative to the Security Implementation Guide, which boasts over 100 pages of official documentation on Salesforce security, I expected the videos to demystify the flexible and nuanced security controls available to system administrators.

However, after watching the Record Access Via Roles video, I feel that almost 70% of the content within is either misleading or simply inaccurate.

Note: I spent significant time staging test cases in an org before deciding to write this blog post, so please let me know if any part of my writing is technically wrong.

2:10 Three ways to open up access? Not quite...


In the AW Computing scenario, the presenter says that access to private Opportunity records can be opened up in one of three ways, quoted below:
  • "No access. In essence, this maintains the org-wide default of private. Users in this role would not be able to see opportunities that they do not own."
  • "View only. Users in a role can view opportunities regardless of ownership."
  • "View and edit. Users in a role can view and edit opportunities regardless of ownership."

This sounds good in concept, but as the video progresses to the demo portion showing how the three ways are actually implemented, the problem becomes clear: the presenter misconstrues the Opportunity Access options.

In reality, the implicit access granted through the role hierarchy automatically solves the requirement presented in the video, and the Opportunity Access options are completely irrelevant to the hypothetical situation.

A more accurate explanation


Let's assume that the role hierarchy is set up as implied by the visual at 1:40 in the video.



Alan can see and edit whatever opportunities Karen and Phil can see and edit. The two reasons are that Alan is above Karen and Phil in the role hierarchy, and that OWD for the Opportunity object is configured to grant access using hierarchies (which as of Spring '15 you still cannot disable for standard objects). There are no more granular controls for records owned within a subordinate chain. If Karen can see a record, Alan can see that record. If Karen can edit a record, Alan can edit that record. Access via subordinates in the hierarchy is that simple.

So what do the Opportunity Access options do? Simply put, the options do exactly what the Role Edit page says they do.


Opportunity Access options have nothing to do with roles and subordinates. The selected option comes into play in situations, such as ones involving account teams, where a user from one branch in the role hierarchy owns an account, but a user from a different branch owns an opportunity for that account.

Try it yourself


Admittedly this will be really difficult if you don't have access to a sandbox org or a Partner Enterprise Edition org, but here's the idea.

Your role hierarchy looks something like the following:
  • CEO
    • SVP Products
      • Product Sales Manager
    • SVP Services
      • Services Sales Manager

Configure Opportunity Access for all roles in the hierarchy so that "users in this role cannot access opportunities that they do not own that are associated with accounts that they do own."

Set OWD for Opportunity to "Private", then do the following:
  1. Log in as a Product Sales Manager
  2. Create an account
  3. Create an opportunity
  4. Log in as a Services Sales Manager
  5. Verify that you cannot see the opportunity
  6. Create a new opportunity on the account owned by the Product Sales Manager
  7. Log in as the Product Sales Manager
  8. Verify that you cannot see the new opportunity created by the Services Sales Manager
  9. Log in as the administrator
  10. Change the Opportunity Access for the Product Sales Manager role so that "users in this role can view all opportunities associated with accounts that they own, regardless of who owns the opportunities."
  11. Log in as the Product Sales Manager
  12. Verify that you can now see the new opportunity created by the Services Sales Manager
  13. Verify that you cannot edit that opportunity

Saturday, May 30, 2015

The Apex Ten Commandments (in Writing)

For anyone (like me) who couldn't find the slides to The Apex Ten Commandments recording referenced on the Architect Core Resources page, here's the written list:

  1. Thou shalt not put queries in for loops
  2. Thou shalt not put DML in for loops
  3. Thou shalt have a happy balance between clicks & code
  4. Thou shalt only put one trigger per object
  5. Thou shalt not put code in triggers other than calling methods and managing execution order
  6. Thou shalt utilize maps for queries wherever possible
  7. Thou shalt make use of relationships to reduce queries wherever possible
  8. Thou shalt aim for 100% test coverage
  9. Thou shalt write meaningful and useful tests
  10. Thou shalt limit future calls and use asynchronous code where possible

And I just have a couple of comments to add for color.

Comments on #7


I haven't tested this hypothesis yet, but... does this commandment still hold with large data volumes? Especially in the context of batch Apex? One certainty is that executing a single query like this is convenient for the developer. But when the query would return thousands of records that reference a small set of parent records, perhaps at larger data volumes a more efficient approach would be to split the query and leverage commandment #6 instead.
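To make the trade-off concrete, here's an anonymous Apex sketch of the two approaches (untested at scale, per the hypothesis above):

    // Commandment #7 style: one relationship query, which repeats the
    // parent fields on every child row returned.
    for (Contact c : [SELECT Id, LastName, Account.Name FROM Contact]) {
        System.debug(c.Account.Name);
    }

    // Split-query alternative leaning on commandment #6: query the small
    // parent set once into a map, then join in memory.
    Map<Id, Account> accountsById = new Map<Id, Account>(
        [SELECT Id, Name FROM Account]
    );
    for (Contact c : [SELECT Id, LastName, AccountId FROM Contact]) {
        Account parent = accountsById.get(c.AccountId);
        if (parent != null) {
            System.debug(parent.Name);
        }
    }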

Comments on #10


The future annotation is slowly becoming obsolete with the introduction of the Queueable interface in Winter '15, although general guidelines for designing asynchronous automation still hold true.

Saturday, April 25, 2015

update vs. undelete in Apex Triggers

Based on the Implementing Triggers training module, there are only four DML operations for which triggers can be executed: insert, update, delete and undelete. But, when an undelete operation occurs, does that also count as an update operation, whereby the sObject.IsDeleted value is toggled?

To settle the matter for my own benefit, I created an Apex test to validate my assumptions. Here's what I learned and confirmed:

  • Only one of Trigger.isUpdate, Trigger.isDelete and Trigger.isUndelete will ever be true during trigger execution. This means that the three operations are indeed distinct and constitute different trigger contexts.
  • The ALL ROWS keyword is required to retrieve soft-deleted records that are in the Recycle Bin.

Below are the trigger and its test class I used.

AccountTrigger.cls
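(The original snippet was embedded as an image; below is a minimal sketch consistent with the findings above.)

    trigger AccountTrigger on Account (
            after insert, after update, after delete, after undelete) {
        // Only one of these context variables is ever true per invocation,
        // confirming that update, delete and undelete are distinct contexts.
        System.debug('isInsert: ' + Trigger.isInsert);
        System.debug('isUpdate: ' + Trigger.isUpdate);
        System.debug('isDelete: ' + Trigger.isDelete);
        System.debug('isUndelete: ' + Trigger.isUndelete);
    }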



AccountTriggerTest.cls
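(Likewise a sketch rather than the original:)

    @IsTest
    private class AccountTriggerTest {
        @IsTest
        static void undeleteIsNotAnUpdate() {
            Account acct = new Account(Name = 'Test');
            insert acct;

            delete acct;

            // ALL ROWS is required to see soft-deleted rows in the Recycle Bin
            Account deletedAcct = [SELECT Id, IsDeleted FROM Account
                                   WHERE Id = :acct.Id ALL ROWS];
            System.assertEquals(true, deletedAcct.IsDeleted);

            undelete acct;

            Account restored = [SELECT Id, IsDeleted FROM Account
                                WHERE Id = :acct.Id];
            System.assertEquals(false, restored.IsDeleted);
        }
    }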


Monday, April 20, 2015

Salesforce Limits All Developers Should Know

As I was going through the Object-Oriented Programming in Apex training module, the section on "Limit Methods" reminded me that out of the myriad limits detailed in Salesforce Limits Quick Reference Guide, there are some that are more important than others. Which ones? How about the ones that have their own Limits methods...


To that end I went through the exercise myself and compiled a list of the Limits methods and what the respective limits are, using the helpful resources below.
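As a quick illustration, each getX() method has a matching getLimitX() that returns the governor ceiling for the current transaction:

    System.debug('SOQL queries: ' + Limits.getQueries()
            + ' of ' + Limits.getLimitQueries());
    System.debug('DML statements: ' + Limits.getDmlStatements()
            + ' of ' + Limits.getLimitDmlStatements());
    System.debug('CPU time (ms): ' + Limits.getCpuTime()
            + ' of ' + Limits.getLimitCpuTime());
    System.debug('Heap size: ' + Limits.getHeapSize()
            + ' of ' + Limits.getLimitHeapSize());
    System.debug('Callouts: ' + Limits.getCallouts()
            + ' of ' + Limits.getLimitCallouts());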


This was a great learning experience for me, as I discovered many old limits have been deprecated, especially the ones previously affecting describe methods! It was also interesting to note that there is no limit on callouts per 24-hour period, or at least none that I could find.

Surprises with and without sharing in Apex

I'll admit that I never dug too deep into the with sharing and without sharing keywords. At a high level I felt that if I wanted to enforce security and visibility, I should use with sharing; otherwise I should use without sharing, which I also assumed was the default.

The Object-Oriented Programming in Apex training module surprised me by telling me that CRUD permissions and field-level security ("FLS") are ignored both with sharing and without sharing! Also, system mode simply means that record-level read and edit privileges are ignored, since CRUD and FLS are always ignored.

This is contrary to what I'd inferred from the Apex Code Developer's Guide, which writes, "In system context, Apex code has access to all objects and fields— object permissions, field-level security, sharing rules aren’t applied for the current user." My interpretation of this statement was that on the flip side, when with sharing is applied, "object permissions, field-level security, sharing rules" would all be applied for the current user.

It was a bit hard for me to believe that Apex would not respect CRUD (if not FLS) permissions, so I created a Visualforce page to test this in my org. And it seemed to me that indeed, an Apex controller created using the with sharing keyword would allow a user without the Delete object permission to delete a record. Crazy... am I missing something?
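(The page and controller from that experiment aren't reproduced here; a minimal sketch with hypothetical names would be:)

    // Declared with sharing, yet the delete DML still succeeds for users
    // who lack the Delete permission on Account, because the sharing
    // keywords govern record visibility, not CRUD or FLS.
    public with sharing class DeleteTestController {
        public Id accountId { get; set; }

        public PageReference deleteAccount() {
            delete new Account(Id = accountId);
            return null;
        }
    }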

Well! What's more interesting was that with a standard controller, CRUD permissions are always observed with actions like delete(), regardless of whether an extension class is defined with sharing or without sharing. For example, a button that invoked the StandardController.delete() action would automatically be hidden if a user didn't have the Delete object permission. Furthermore, if a custom action in the extension class invoked the standard controller's action, the custom action would also be subject to the user's CRUD permission, generating an "insufficient privileges" error.

So, I guess the way to enforce CRUD is to use standard controllers, which I don't think is always feasible, especially with mass actions.

Wednesday, April 1, 2015

Events Created By Apex Respect DST

For my own edification, I wanted to confirm that Apex in Salesforce is capable of automatically adjusting for DST based on the user's local time zone.

The scenario: As a user in the America/New_York time zone, when I create an event using Apex for July 4, 2015 (EDT) at 9:00 AM and another event for December 25, 2015 (EST) at 9:00 AM, I expect the following:

  • Both events should appear in the UI as starting at 9:00 AM on my calendar
  • The UTC start time for the July 4 event should be 13:00
  • The UTC start time for the December 25 event should be 14:00, which accounts for the end of Daylight Saving Time

The following code confirms the expected behavior:
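(The original snippet was an image; equivalent anonymous Apex, run as a user in the America/New_York time zone, would look like this:)

    // DateTime.newInstance() interprets values in the running user's
    // local time zone, so the UTC offset differs between the two dates.
    DateTime july = DateTime.newInstance(2015, 7, 4, 9, 0, 0);
    DateTime december = DateTime.newInstance(2015, 12, 25, 9, 0, 0);

    insert new List<Event> {
        new Event(Subject = 'Independence Day', StartDateTime = july,
                EndDateTime = july.addHours(1)),
        new Event(Subject = 'Christmas', StartDateTime = december,
                EndDateTime = december.addHours(1))
    };

    System.assertEquals('13:00', july.formatGmt('HH:mm'));     // EDT = UTC-4
    System.assertEquals('14:00', december.formatGmt('HH:mm')); // EST = UTC-5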

Thursday, February 12, 2015

Queueable vs. @future throwdown!

At first blush, the new Queueable interface appears to supersede the old @future annotation in Apex, especially now that in Spring '15 you can chain a job to another job an unlimited number of times. Yes, that's right: unlimited.

So, what's the purpose of @future in this new age of Apex?

Let's start by comparing the well-known @future limits with Queueable.

@future consideration vs. Queueable counterpart:
  • @future: Some governor limits are higher, such as SOQL query limits and heap size limits.
    Queueable: Some governor limits are higher than for synchronous Apex, such as heap size limits.
  • @future: Methods with the future annotation must be static methods.
    Queueable: Implementations must be instantiated as objects before the execute() instance method is called, leaving room for additional job context.
  • @future: Methods with the future annotation can only return a void type.
    Queueable: Classes must implement public void execute(QueueableContext), which is how a job is initiated.
  • @future: The specified parameters must be primitive data types, arrays of primitive data types, or collections of primitive data types; methods with the future annotation cannot take sObjects or objects as arguments.
    Queueable: An object can be constructed with any type of parameter, stored as private member variables.
  • @future: Can make a callout to an external service.
  • @future: A future method can't invoke another future method.
    Queueable: You can chain queueable jobs, though you can add only one job from an executing job, which means that only one child job can exist for each parent job.
  • @future: No more than 50 method calls per Apex invocation.
    Queueable: You can add up to 50 jobs to the queue with System.enqueueJob in a single transaction.
  • Both: The maximum number of future method invocations per 24-hour period is 250,000 or the number of user licenses in your organization multiplied by 200, whichever is greater. This is an organization-wide limit shared by all asynchronous Apex: Batch Apex, Queueable Apex, scheduled Apex, and future methods. The licenses that count toward this limit are full Salesforce user licenses or Force.com App Subscription user licenses.

From the reverse side, what about the known limits of Queueable?

Queueable consideration vs. @future counterpart:
  • Queueable: The execution of a queued job counts once against the shared limit for asynchronous Apex method executions.
  • Queueable: You can add up to 50 jobs to the queue with System.enqueueJob in a single transaction.
    @future: No more than 50 method calls per Apex invocation.
  • Queueable: No limit (except in DE and Trialforce orgs) is enforced on the depth of chained jobs, which means that you can chain one job to another job and repeat this process with each new child job to link it to a new child job.
    @future: n/a (you cannot chain @future methods).
  • Queueable: When chaining jobs, you can add only one job from an executing job with System.enqueueJob, which means that only one child job can exist for each parent queueable job. Starting multiple child jobs from the same queueable job isn't supported.
  • Queueable: You can't chain queueable jobs in an Apex test; doing so results in an error. To avoid the error, check whether Apex is running in test context by calling Test.isRunningTest() before chaining jobs.

The verdict: Implement Queueable as a standard approach, and only look to @future if for some reason Queueable gives you unexpected or undocumented problems.
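For reference, here's a minimal Queueable sketch (all names are hypothetical) showing two things @future can't do: carrying sObjects as state and chaining a child job.

    public class AccountCleanupJob implements Queueable {
        // Unlike @future methods, a Queueable can hold sObjects as state.
        private List<Account> accounts;

        public AccountCleanupJob(List<Account> accounts) {
            this.accounts = accounts;
        }

        public void execute(QueueableContext context) {
            for (Account acct : accounts) {
                acct.Description = 'Processed asynchronously';
            }
            update accounts;

            // Chain at most one child job per executing job, and skip
            // chaining in tests, where it would throw an error.
            List<Account> remaining = [SELECT Id, Description FROM Account
                                       WHERE Description = null LIMIT 200];
            if (!remaining.isEmpty() && !Test.isRunningTest()) {
                System.enqueueJob(new AccountCleanupJob(remaining));
            }
        }
    }

Kick it off with System.enqueueJob(new AccountCleanupJob(someAccounts)), which returns a job ID you can monitor via AsyncApexJob.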

Friday, November 21, 2014

Login with Facebook for a community with Visualforce

Adding an auth provider (e.g., Login with Facebook) to a Salesforce Communities instance is pretty easy.


The Login with ... button is pretty magical: It knows what community you're logging into, and it knows what URL you're trying to access upon authentication. When you click the button, that information takes you to the correct authentication URL per the Auth Provider configuration in Salesforce.

But... for orgs that have customized their login pages with Visualforce, how do you add this button? Luckily, adding the same magical button to a custom Visualforce login page doesn't require you to manually reconstruct the OAuth authentication endpoint. Instead, you can just use the AuthConfiguration.getAuthProviderSsoUrl() method.

Simply put, you need just three things:
  • Your community's base URL
  • The relative path after the base URL to which the user should be taken upon successful authentication. This is also known as the startUrl.
  • The URL Suffix (a.k.a. DeveloperName) of your auth provider (e.g., Facebook) as configured in Salesforce

Once you have this information, all you need to do is create an action in your Apex controller that returns a PageReference object, constructed from the URL returned by AuthConfiguration.getAuthProviderSsoUrl().
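(The original snippet isn't shown here; a sketch with placeholder values:)

    public with sharing class CommunityLoginController {
        public PageReference loginWithFacebook() {
            // All three values below are illustrative; substitute your own
            String communityUrl = 'https://acme.force.com/customers';
            String startUrl = '/home/home.jsp';
            String developerName = 'Facebook';

            String ssoUrl = Auth.AuthConfiguration.getAuthProviderSsoUrl(
                communityUrl, startUrl, developerName);
            return new PageReference(ssoUrl);
        }
    }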


This same approach can be applied to other auth providers as well, such as Twitter or Google.

AuthConfiguration.getAuthProviderSsoUrl(String, String, String) explained

Curious about AuthConfiguration.getAuthProviderSsoUrl(String, String, String)? Here are some notes about the parameters that may not be apparent the first time you read the documentation.

String cUrl


You'll want this to match exactly what you see when you're looking at your community's Administration Settings. Don't add any extra trailing slashes!


String startUrl


This should be the URL relative to the value you put for cUrl, including the leading forward-slash ('/'). For example, if you want to land the user at the Home tab, startUrl should be set to "/home/home.jsp" (again, note the inclusion of the leading forward-slash).

Think about it this way: If you concatenate cUrl and startUrl, exactly as written, you should end up with a valid URL to the desired page or resource within your community.

String developerName


This is the DeveloperName field from the AuthProvider object. Or, you can find the value by looking at the URL Suffix field on the Auth Provider detail page.


You can find this value dynamically by querying the AuthProvider object.
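For example, this anonymous Apex lists each provider's name and URL Suffix:

    for (AuthProvider provider : [SELECT FriendlyName, DeveloperName
                                  FROM AuthProvider]) {
        System.debug(provider.FriendlyName + ' => ' + provider.DeveloperName);
    }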

Monday, September 22, 2014

Your request cannot be processed at this time. The site administrator has been alerted.

In setting up and testing self-registration for a new community built on Salesforce Communities, you may encounter this confounding error: "Your request cannot be processed at this time. The site administrator has been alerted."

"Alerted" in the above statement refers to an email alert that went out to the site administrator. But who is the "site administrator"? You can find the email address used for this notification by locating the Site Guest User for your community's Force.com site.
  1. Open Setup > Manage Communities
  2. Click Force.com for the community in question
  3. Click Edit
  4. Change Site Contact to the user who should receive notifications when errors arise with self-registration. Make sure the user's email address is valid.


For sandboxes, make sure your Access to Send Email is set to "All email", under Setup > Email Administration > Deliverability. (Thanks Lynnet!)


Below are a few common reasons why you may be encountering the problem:
  • The community is not published. In case you're working with a community that is offline or in Preview status, you need to publish the community before self-registration will work.
  • The Account Owner does not have a User Role assigned. If you're creating a new Account record on the fly, especially in B2C situations, you need to make sure you assign a default account owner that also has a User Role value. Any role will do, and you can use either a workflow rule or Apex to perform the assignment.
  • The site guest user does not have Create and Read permissions on the Account and Contact objects. Edit Public Access Settings for the community's Force.com site to grant these permissions, along with field permissions for any fields that are included on the self-registration form. Note that by default the site's profile will not have these permissions.
  • Self-registration is not enabled for the community. Go to Manage Communities, click Edit and make sure the Login settings show that self-registration is enabled.

For more details on setting up a community for self-registration, please refer to the Getting Started with Communities guide.

Wednesday, August 20, 2014

Close recurring tasks using Apex

At least one developer has reported encountering an error when trying to automate updates to Task records using Apex: System.UnexpectedException: Apex Task trigger cannot handle batch operations on recurring tasks. This error occurs even in simple scenarios, such as closing any open tasks on a closed opportunity.

While I agree with what I imagine to be the general consensus that this error seems crazy, luckily there appears to be a workaround: update a custom flag using Apex, and then use workflow rules to handle the dirty work of updating the restricted standard field(s).

While I've only tested this approach on the scenario below, I think it will be applicable to solving other similar problems with updating Task records using Apex.

Sample solution to close recurring tasks using Apex


The anonymous Apex code below illustrates the solution.
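(The original anonymous Apex was embedded as an image; below is a sketch assuming a hypothetical Close_Task__c checkbox.)

    // Instead of setting Status directly, which throws the exception above
    // for recurring tasks, flip the custom checkbox and let the workflow
    // field update set Status = 'Completed'.
    List<Task> openTasks = [SELECT Id FROM Task WHERE IsClosed = false];
    for (Task t : openTasks) {
        t.Close_Task__c = true;
    }
    update openTasks;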


And there's nothing special about the checkbox field, the workflow rule or the field update. The idea is that when the checkbox is checked, the workflow rule fires and performs the field update.


Friday, August 8, 2014

Created By and Last Modified By not set before insert or update

After some frustrating troubleshooting of what I thought would be simple code today, I learned something unexpected: Created By and Last Modified By are not populated or set in the before context of DML operations. This means that before insert, CreatedById, CreatedDate, LastModifiedById and LastModifiedDate are all empty. And before update, LastModifiedById and LastModifiedDate will reflect the previous person to modify the record, not the current user (which would be yourself if you are the one editing the record).

While this was counter-intuitive at first, I came to realize that there's no need to rely on those fields in the context of Apex triggers. All I really need is UserInfo.getUserId() and DateTime.now() to get the time.
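For example, a before insert trigger can stamp the acting user and time itself (the custom fields here are hypothetical):

    trigger OpportunityAudit on Opportunity (before insert) {
        for (Opportunity opp : Trigger.new) {
            // CreatedById and CreatedDate are still empty at this point,
            // so capture the acting user and timestamp directly.
            opp.Submitted_By__c = UserInfo.getUserId();
            opp.Submitted_At__c = DateTime.now();
        }
    }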

Just like that... I learned something new today.

Monday, June 23, 2014

Calling Page Controller Methods from a Component

Since this is my first blog post in Japanese, I apologize for any grammatical and cultural mistakes. The content below is adapted from "Controller Component Communication".

At some point, the need arises to call a page controller method from an apex:component. In this case, if you pass the page controller to the component controller via an apex:attribute, the component can make that call.

Sample code using the standard controller is provided below for reference. The sample shows how to create the LeadCustomSave page.


LeadCustomSave.page
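(The original markup was embedded as an image; in the sketches below, everything beyond the four file names is my assumption.)

    <apex:page standardController="Lead" extensions="ControllerHandleExtension">
        <apex:form>
            <apex:inputField value="{!Lead.LastName}"/>
            <!-- Hand the page controller down to the component -->
            <c:CustomSave pageController="{!controllerHandle}"/>
        </apex:form>
    </apex:page>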



ControllerHandleExtension.cls
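(A sketch; the extension exposes itself so the page can pass it along.)

    public with sharing class ControllerHandleExtension {
        private ApexPages.StandardController stdController;

        public ControllerHandleExtension(ApexPages.StandardController stdController) {
            this.stdController = stdController;
        }

        // Exposed so the page can hand this extension to the component
        public ControllerHandleExtension getControllerHandle() {
            return this;
        }

        public PageReference doSave() {
            return stdController.save();
        }
    }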



CustomSave.component
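(A sketch; the apex:attribute wiring is the important part.)

    <apex:component controller="CustomSaveController">
        <apex:attribute name="pageController" type="ControllerHandleExtension"
                assignTo="{!pageControllerHandle}"
                description="The page controller passed in by the page"/>
        <apex:commandButton value="Save" action="{!customSave}"/>
    </apex:component>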



CustomSaveController.cls
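(A sketch; the component controller simply delegates back to the page controller.)

    public with sharing class CustomSaveController {
        public ControllerHandleExtension pageControllerHandle { get; set; }

        public PageReference customSave() {
            return pageControllerHandle.doSave();
        }
    }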


Monday, May 26, 2014

Last Event or Task Date vs. Last Activity Date

Salesforce gives users the ability to run reports on the Last Activity Date for all objects, out of the box. But what if users wanted to distinguish between Last Event Date vs. Last Task Date? Or break up the data by other criteria?

Luckily, the Declarative Rollups for Lookups package (thanks to Andy Fawcett) can solve this problem, with only a few minor tweaks to work around a Salesforce limitation: field ActivityDate does not support aggregate operator MAX.

The general idea is to create two custom fields on the object of your choice: Last Event Date and Last Task Date. Follow these up with two Lookup Rollup Summaries, and you can now easily split your activity data by Tasks vs. Events. The end result? Reports like the following.


Follow this tutorial and give these new fields a try! Please add a comment below to let me know whether this tip works for you.

Friday, February 21, 2014

"Illegally" Reparenting Children in Master-Detail Relationship with Apex

If you have a Master-Detail field in Salesforce that's not reparentable, what do you think will happen if you use Apex to change the field value on a record? I had expected to see an exception thrown. But after a few hours of head-scratching, I discovered I was wrong.

If you try to change the value of a Master-Detail field that is not configured so that "child records can be reparented," you will not see an error. Salesforce will simply leave the existing value in place as if you never attempted to make the change in the first place.

The interesting implication here is that developers should write Apex tests to validate the expected Master-Detail configuration. Admins and developers can work together more reliably if an Apex test explicitly confirms whether a Master-Detail field can be reparented. This way, when an admin changes the field configuration and unintentionally breaks something, a red flag will be raised by the test method, pointing you clearly to the unexpected configuration change.

To see what happens if you reparent a child record when that's not "allowed", try this:
  1. Create a custom object labeled "Alias" (Alias__c), with the record name configured as a text field
  2. Create a Master-Detail field on Alias labeled "Account" (Account__c)
  3. Configure the Account field on Alias so that child records cannot be reparented to other parent records after they are created
  4. Create an AliasUtilTest class, as seen in this paste (sketched below), with a test method to assert that changing the Account field on an alias is a futile effort
  5. Run the test method in AliasUtilTest
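(The paste may no longer be available; a minimal sketch of such a test:)

    @IsTest
    private class AliasUtilTest {
        @IsTest
        static void reparentingIsSilentlyIgnored() {
            Account first = new Account(Name = 'First');
            Account second = new Account(Name = 'Second');
            insert new List<Account> { first, second };

            Alias__c record = new Alias__c(Name = 'Test Alias',
                                           Account__c = first.Id);
            insert record;

            // The "illegal" reparent: no exception is thrown...
            record.Account__c = second.Id;
            update record;

            // ...but the change is silently discarded.
            record = [SELECT Account__c FROM Alias__c WHERE Id = :record.Id];
            System.assertEquals(first.Id, record.Account__c);
        }
    }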

Friday, December 20, 2013

Testing Apex @future Methods in Salesforce

To write functional Apex unit tests for methods having the future annotation, all you need to do is use Test.startTest() and Test.stopTest(). The general structure of your testMethod will look like the following.
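(The original snippet was an image; a sketch with hypothetical class and method names:)

    @IsTest
    private class FutureMethodTest {
        @IsTest
        static void validatesFutureWork() {
            // 1. Set up test data here

            Test.startTest();
            MyFutureClass.doWorkAsync(new List<Id>());
            Test.stopTest(); // forces the queued @future call to execute

            // 2. Query and assert on the results of the asynchronous work
        }
    }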

Whether it's scheduled Apex or an @future method, calling Test.stopTest() causes the queued asynchronous work to execute synchronously at that point in your test method, which then allows you to validate the results of @future Apex using normal means.

Tuesday, December 3, 2013

Deploying Destructive Changes Using Workbench

Sometimes, especially in the case of custom Apex or Visualforce, a Salesforce admin or developer needs to delete components from an org. However, Salesforce's user-friendly change sets feature does not allow admins to propagate component deletions. The only semi-automated alternative to performing these deletions, especially in production orgs, is to leverage the metadata API.

Fortunately, with the availability of Workbench on Developer Force, the steps required for deploying destructive changes (that delete components) are pretty simple:

  1. Create a package.xml file
  2. Create a destructiveChanges.xml file
  3. Bundle the two files together in a .zip file
  4. Deploy the .zip package using Workbench

As you can see from this sample .zip package, the files are fairly simple and straightforward. Multiple types of metadata can be removed with a single package.
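(The linked sample isn't reproduced here; assuming hypothetical components named ObsoleteClass and ObsoletePage, the two files would look something like this. package.xml carries only the API version:)

    <?xml version="1.0" encoding="UTF-8"?>
    <Package xmlns="http://soap.sforce.com/2006/04/metadata">
        <version>29.0</version>
    </Package>

And destructiveChanges.xml names the components to delete:

    <?xml version="1.0" encoding="UTF-8"?>
    <Package xmlns="http://soap.sforce.com/2006/04/metadata">
        <types>
            <members>ObsoleteClass</members>
            <name>ApexClass</name>
        </types>
        <types>
            <members>ObsoletePage</members>
            <name>ApexPage</name>
        </types>
    </Package>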

The exact steps for deploying using Workbench 29.0.1 are:
  1. Open the migration menu, then click Deploy
  2. Click Browse... and select the .zip package file
  3. Mark the "Rollback On Error" checkbox
  4. Mark the "Single Package" checkbox
  5. Mark the "Run All Tests" checkbox
  6. Click Next
  7. Review the deployment options, then click Deploy

The results, successful or otherwise, will be displayed in Workbench for you to review once the deployment process is complete.