
Appirio Technology Blog

Sunday, December 28, 2008

The Power of Force.com Sites

Michael McLaughlin

The announcement of Force.com Sites at Dreamforce 2008 was huge. Yes, it allows organizations with minimal IT staff to consolidate their Salesforce.com instance with their public-facing web presence (see the Cathedral Partners Sites-based public website).

But even cooler than that, it allows that same public-facing web presence to show content directly out of your Salesforce org! Of course, you could just pull "content" out of your org, but the power of Sites is being able to pull real-time data...without authenticating! Enough with the marketing pitch...what does that mean to you? The ability to read and write data in your org as a public user is huge. For example, you can present real-time dashboard-type data about the number of accounts you are servicing, an up-to-date product and pricing list, hot news items, or the latest campaign details. This kind of data would typically need to be extracted from Salesforce, digested, and reposted to your external site...a process that, including reviews and upload cycles, could take days or more. Now it's all instant!

OK, so Sites is a great tool. Here are some gotchas and areas to double-check before publishing your Sites site:

  • Ensure that you are not showing too much data

  • ---The ability to show real-time production data is a huge benefit, but it is also a huge liability. Ensure that the Public Sites User is locked down and only has access to the objects you want to publish.

  • Be sure all the correct switches are flipped

  • ---Nothing is more frustrating than going live with a dud site! The public setting on your controllers, images, and static resources needs to be True. The Cache and Expires parameters on your <apex:page> tag need to be set appropriately so users avoid stale data (hint: the Expires parameter is in seconds; see the sketch after this list). These parameters might seem useless (after all, why not always pull the latest data?), but they can be cost savers since Sites is priced on the number of page views. If you can prevent users from costing you a page view every time they hit their back buttons, I'm sure your CFO will appreciate it!

  • Sites is built on standard Visualforce pages with Apex controllers. This means that test methods and coverage limits must be met before deploying.

  • Beg, borrow, and steal from your current webmaster all of his/her stylesheets, JavaScript, and images so your Sites pages blend seamlessly with any static pages.

  • Furthermore, ensure that you have pointed your Sites URL at your domain. For example, out of the box, your Sites URL will be something like yournamehere.na1.force.com. To present a truly seamless look and feel (in case your users glance up at the URL), work with Salesforce and your web host to add a CNAME record that points your "friendly" URL at your Sites address.

  • Finally, polish up your error pages. When you enable Sites, you get several standard error pages for typical errors such as a 404 (page not found) and 501 (server error). Be sure to apply the same styling to these pages as you did to the rest of your Sites implementation so it looks super-slick even when a user hits a boo-boo.
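As a minimal sketch of those caching switches (the controller name and values here are illustrative, not from the original post), the attributes sit directly on the page tag:

<apex:page controller="ProductCatalogController" cache="true" expires="600">
    <!-- Cached copies may be served for up to 600 seconds (10 minutes), so repeat
         visits and back-button hits don't each cost you a page view. -->
    ...
</apex:page>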


Sites isn't generally available for all orgs at the time of this writing. Be sure to hit this link to ask for Sites in your org. The general guesstimate for availability is summer 2009. Have fun and be safe!

Saturday, December 27, 2008

TinyURL POST API

I was exploring Twitter's use of the TinyURL utility and couldn't find any more information on the API from TinyURL's website. After a small amount of Google searching, I found the HTTP POST API for TinyURL.

So, a small example:

http://tinyurl.com/api-create.php?url=http://www.appirio.com/techblog

Returns the plain text:

http://tinyurl.com/7vzap5
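Since the endpoint just takes the long URL as a query parameter and returns plain text, it's easy to call from Apex as well. A minimal sketch (the class name is illustrative, and tinyurl.com must be added as a Remote Site before callouts will succeed):

public class TinyUrlClient {
    // Returns the shortened URL for longUrl, or null if the callout fails.
    public static String shorten(String longUrl) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('http://tinyurl.com/api-create.php?url=' +
            EncodingUtil.urlEncode(longUrl, 'UTF-8'));
        req.setMethod('GET');
        HttpResponse res = new Http().send(req);
        return res.getStatusCode() == 200 ? res.getBody() : null;
    }
}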

Have fun.


Monday, December 1, 2008

Using Workflow to Update a Case When an Email-to-Case Message is Received

Andrea Giometti

We are going back to basics here, but recently we were looking to implement what seemed to be basic functionality and found ourselves jumping to complex solutions while overlooking the native functionality within Salesforce.  With the development tools now available in Salesforce such as Visualforce and Apex Code, customizations are seemingly limitless.  However, why reinvent the wheel if a solution already exists within Salesforce's standard functionality? 

The issue was that we needed to update the status of a case in Salesforce when an inbound email was received for an existing case via email-to-case.  It turns out that since the Spring '08 release, you can do this with standard email-to-case workflow.  When you enable email-to-case in your org, an object called Email Message is enabled for workflow rules. In that object is a field called 'Is Incoming' which is set to true for any inbound emails.  The key to the workflow is that you don't apply it to the Case object, you apply it to the Email Message object.  This will allow you to then do a field update on the Case object.  Similarly, you can use workflow rules when a case comment is received by applying the workflow rule to the Case Comment object.  This can be particularly helpful if you are using a Customer Portal and want to update a field on the Case object when a customer adds a comment to a case.

By implementing a simple workflow rule, we were able to add functionality that would automatically re-open a case if an email associated with a closed case is sent by the customer.   Other uses for this workflow rule could be to re-assign a case, say to a queue, when an email is received and the case meets certain criteria.  In addition to checking the Is Incoming field, you could also create a workflow rule that checks the status of the email message and makes changes to a case field based on that status.  Basically, once you discover the Email Message object is available in workflow, the possibilities are endless.

To create the workflow rule that updates a case field on an inbound email:

  1. Go to Setup | App Setup | Create | Workflow & Approvals | Workflow Rules and click New Rule.  
  2. Select Email Message as the Object the workflow rule applies to and click Next (note that Email Message will only be available if email-to-case has been enabled in the org). 
  3. Enter a name for the workflow rule and select when it should be evaluated.
  4. Enter the following criteria to enable the workflow rule to fire when an email is inbound:
    • 'Email Message: Is Incoming' equals 'True'
  5. Add additional criteria if you only want the workflow rule evaluated under certain circumstances such as Case: Closed equals True, or Case: Status does not contain Closed.
  6. Click Save & Next
  7. Click Add Workflow Action and select New Field Update
  8. Enter a name for the Field Update and then select the case field to update.

Friday, November 14, 2008

Learning Apex: Display Multi-Object Data in Tables Easily with Apex Dummy Classes

Will Supinski

Creating tables in Visualforce is easy. Provide a list of Objects to display and define various columns that access different items in each Object. Easy. Things become tricky when one needs to display data from more than one object in a table. To solve this problem one need only define a dummy class to hold all the relevant data. This article will detail such a process.

Let us begin by inspecting the syntax of a simple Visualforce page that displays a table:

<apex:page controller="TrainingController ">
   <apex:pageBlock title="Users">
      <apex:pageBlockTable value="{!Users}" var="item">
         <apex:column value=”item.Name”/>
      </apex:pageBlockTable>
   </apex:pageBlock>
</apex:page>

public class TrainingController
{
   public User[] getUsers()
   {
      return [select Name, Id from User];
   }
}

The above code will print out all the names for users returned by getUsers() in a shiny new table. This is easy to do without any special technique.

Consider a slightly more complex situation. You are building a Learning Management System that associates Users with Transcripts and TrainingPaths. You create a Transcript and TrainingPath custom object that each have a reference to a User defined as Trainee__c. Now you want to display each trainee in a table with the associated TrainingPath name and Transcript percentComplete field. But, how can we display three different objects within the same table? This is the question answered through the creation and use of dummy objects.

An incorrect approach to solving this issue is to create Apex methods that query the objects and then call them from individual columns. Unfortunately, life is not that easy: this solution does not scale because the number of queries is proportional to the number of entries in the table. As soon as the table grows, the governor limits will be hit and your page will fail to load.

A working solution is the creation of apex dummy classes. The idea of dummy classes is that we create an apex class with the sole purpose of providing access to more than one object. Check out the dummy class below:

public class TrainingPathDummy
{
   public Training_Path__c tp { get; set; }
   public Transcript__c transcript { get; set; }
   public User trainee { get; set; }
   public TrainingPathDummy(Training_Path__c tp1, Transcript__c trans1, User trainee1 )
   {
      tp = tp1;
      transcript = trans1;
      trainee = trainee1;
   }
}

This dummy class has a member variable for each of the data objects we want to display in our table. Notice that the constructor has a parameter for each of the class variables. These are passed in from the controller so that no queries are needed within the dummy class. A list of these TrainingPathDummy objects can be iterated over in the pageBlockTable, and its member objects can be accessed in the table easily, as seen below:

<apex:page controller="TrainingController ">
   <apex:pageBlock title="Users">
      <apex:pageBlockTable value="{!TrainingPathDummys}" var="dummy">
         <apex:column value=”dummy.trainee.Name”/>
         <apex:column value=”dummy.tp.Name”/>
         <apex:column value=”dummy.transcript.PercentComplete__c”/>
      </apex:pageBlockTable>
   </apex:pageBlock>
</apex:page>

The controller class must do all the heavy lifting of querying the data and forming it into dummy classes. Populating the list of dummy classes takes only 3 queries regardless of the size of the table. Governor safe and mother approved!

public class TrainingController
{
   public User[] getUsers()
   {
      return [select Name, Id from User];
   }

   public Transcript__c[] getTranscripts()
   {
      return [select Name, Id, PercentComplete__c, Trainee__c from Transcript__c];
   }

   public Training_Path__c[] getTrainingPaths()
   {
      return [select Name, Id, Trainee__c from Training_Path__c];
   }

   public TrainingPathDummy[] getTrainingPathDummys()
   {
      TrainingPathDummy[] result = new TrainingPathDummy[]{};

      //query for all the data
      User[] allUsers = getUsers();
      Transcript__c[] allTranscripts = getTranscripts();
      Training_Path__c[] allTPs = getTrainingPaths();

      //find all the related data and put into a dummy object
      for(User u: allUsers)
      {
         //get the related Transcript
         Transcript__c curTrans;
         for(Transcript__c trans: allTranscripts)
         {
            if(trans.Trainee__c == u.Id)
            {
               curTrans = trans;
               break;
            }
         }

         //get the related Training Path
         Training_Path__c curTrainingPath;
         for(Training_Path__c tp: allTPs)
         {
            if(tp.Trainee__c == u.Id)
            {
               curTrainingPath = tp;
               break;
            }
         }

         //create the dummy object and add it to the result list
         result.add(new TrainingPathDummy(curTrainingPath, curTrans, u));
      }
      return result;
   }
}

Using Dummy classes is a useful skill for displaying data logically while keeping the total number of queries low. Add this method to your developer toolbox today!

Tuesday, October 21, 2008

Using Client-Side Looping to Work within Salesforce.com Governor Limits

Chris Bruzzi

Repeat after me. The governor is our friend. It stops us from doing things we really shouldn't be doing, so in a way the governor makes us a better person. At least as far as SaaS development goes.

As you may already be all too familiar, Salesforce.com imposes limitations to ensure that customers sharing its multi-tenant environment do not monopolize resources. These limits are called governors and are detailed in the Understanding Execution Governors and Limits section of the Apex Language Reference. If a script exceeds one of these limits, the associated governor issues a runtime exception and code execution is halted.
I am about to guide you through a simple example of using client-side looping in Visualforce to execute server-side Apex code that would otherwise have been unacceptable under the governor limits.

Modifying your Apex

There are a number of situations when a solution like this might be helpful, but consider this one: you want to move 10 million records from Source_Object__c to Target_Object__c via Apex. You would hit the governor limits on the number of records retrieved via SOQL and the number of records processed via DML, to name just a few.

Assuming there isn't already an autonumber field on Source_Object__c that could help us keep track of our progress processing the records, we'll first need to add a checkbox field to Source_Object__c called Processed__c.

We can then use that field in our SOQL query to ignore records already processed, and likewise set it to true as we process records. You would then need to modify your method with a few lines of code similar to what is shown below.


global class BatchProcessDemo {
    webservice static void processItems() {
        Integer queryLimit = (Limits.getLimitQueryRows() - Limits.getQueryRows()) / 2;
        for (List<Source_Object__c> sourceItemList : [select Id, Color__c, Weight__c
                                                      from Source_Object__c
                                                      where Processed__c = false
                                                      limit :queryLimit]) {
            List<Target_Object__c> itemsToInsert = new List<Target_Object__c>();
            for (Source_Object__c sourceItem : sourceItemList) {
                sourceItem.Processed__c = true;
                Target_Object__c targetItem = new Target_Object__c();
                targetItem.Color__c = sourceItem.Color__c;
                targetItem.Weight__c = sourceItem.Weight__c;
                targetItem.Source_Object__c = sourceItem.Id;
                // If the next insert would hit the DML row limit, flush what we have so far
                if (Limits.getDMLRows() + itemsToInsert.size() + 1 >= Limits.getLimitDMLRows()) {
                    insert itemsToInsert;
                    itemsToInsert.clear();
                }
                itemsToInsert.add(targetItem);
            }
            update sourceItemList;
            insert itemsToInsert;
        }
    }
}


Creating the Visualforce Page
As mentioned in a previous post by Frank and Kyle, make sure you have Development Mode enabled and then point your browser to http://server.salesforce.com/apex/BatchDemo to create your page. Click on Page Editor in the bottom left of the browser to open the Visualforce Editor. Add the following code between the <apex:page> tags to set up our form:

<apex:sectionHeader title="Demo"/>
<apex:form>
<apex:pageBlock title="Perform Batch Process">
<apex:panelGrid columns="2" id="theGrid">
<apex:outputLabel value="Max. # of Iterations"/>
<input type="text" value="1" name="iterations" id="iterations"/>
</apex:panelGrid>
</apex:pageBlock>
</apex:form>

You'll notice we use standard HTML input fields rather than Apex input fields since no Visualforce controller is required. The fields will only be used on the client side via JavaScript to batch our calls to Apex.
Add a <div> tag immediately after the </apex:panelGrid> tag to display progress during the batch processing.

<div id="progress" style="color: red"/>

After the <div> tag, add a button to allow us to kick off the processing.

<apex:pageBlockButtons>
    <input type="button" onclick="batchProcess()" value="Start" class="btn"/>
</apex:pageBlockButtons>

Next, we'll need to define the batchProcess() method by adding the following code after the first <apex:page> tag.

<script language="javascript">
function batchProcess(){
var iterations = document.getElementById("iterations").value;
var progress = document.getElementById("progress");
progress.innerHTML = "Processing iteration 1 of " + iterations + " iterations.";
sforce.connection.sessionId = "{!$Api.Session_ID}"; //to avoid session timeout
for (i=1; i <= iterations; i++) {
progress.innerHTML = "Processing iteration " + i + " of " + iterations + " iterations.";
sforce.apex.execute("BatchProcessDemo","processItems ",{});
}
progress.innerHTML = "Completed processing " + iterations + " iterations!";
}
</script>
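One assumption worth calling out: sforce.connection and sforce.apex come from the AJAX Toolkit, so the page also needs the toolkit scripts included somewhere above this function. A minimal sketch (the API version number is illustrative):

<script src="/soap/ajax/14.0/connection.js" type="text/javascript"></script>
<script src="/soap/ajax/14.0/apex.js" type="text/javascript"></script>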

Click Save. Now you can click the Start button on your VisualForce page to perform the job in batches.

Thursday, October 16, 2008

Google Apps Auth Backend for Django

Tim Garthwaite

Google loves Python. In fact, Google's original web spider, which crawled the web to build its search index, was written while Larry Page and Sergey Brin (the founders) were still graduate students at Stanford, and rumor has it that it went live written completely in Python. I learned in university (circa 2000) that the Python code performed well enough that much of it was still Python at that time, although parts had been highly optimized in platform-specific C. Moreover, Google's new Platform-as-a-Service (PaaS), AppEngine, which allows anyone in the world to host complete web applications "in the cloud" for free (heavy use is charged at far below-market rates), currently supports only one language: you guessed it, Python. Google has said it will release AppEngine SDKs for other languages, but for now Python is it.

AppEngine, it can be argued, may not be ready for prime-time commercial or enterprise use, as it does not support SSL for all communication between the browser and its servers. Authentication can be done safely by redirecting to a secure login page and returning with a token, but the token (and all your corporate data) would then be passed back and forth in plaintext from then on. Google has promised to add SSL support to AppEngine, but until it does, Appirio's Google Practice has begun recommending the full Django platform (on Apache or, heaven forbid, IIS) for internally developed applications, in anticipation that converting these web applications to AppEngine later will be relatively painless.

The AppEngine Python SDK comes with much of the Django framework pre-installed, including its fantastic templating system. Also, the Object-Relational Mapping (ORM) system built into AppEngine is remarkably similar to the ORM that comes with Django, and the AppEngine authentication system is remarkably similar to its Django equivalent as well. These facts should make a future conversion from custom in-house Django applications to AppEngine (throwing out your pesky web servers and gaining the best features of the world's most robustly distributed computing platform in the process) relatively painless.

So let's say you wish to go ahead with creating Python/Django web applications in-house. Django comes with an authentication framework that allows for custom back-ends, meaning that you can test username/password combinations against an arbitrary back-end system, such as Active Directory or any other LDAP system, or even against users stored in a custom database. For one of Appirio's clients who is fully embracing the cloud, including Google Mail, Calendar, and Docs corporate-wide, it made the most sense for a certain application to authenticate against Google Apps itself using Google's Apps Provisioning API. Here's how I accomplished this.

First, you must create the back-end Python class. For example purposes, I have created a 'mymodule' directory (anywhere in my Python path) containing an empty __init__.py file (telling Python to treat this directory as a module) and the file django_backend.py. Of course, you must replace "mydomain.com" with your own domain, and as your Python code base grows, you should adhere to a more logical standard for where you place your libraries. It would make sense to think about this and begin now so you won't have to refactor your code. In my system, the class file is in the 'appirio.google' module. Here are the contents of this file:

from django.contrib.auth.models import User, check_password
from gdata.apps.service import AppsService
from gdata.docs.service import DocsService

DOMAIN = 'mydomain.com'
ADMIN_USERNAME = 'admin_user'
ADMIN_PASSWORD = 'p@s$w3rd'

class GoogleAppsBackend:
    """ Authenticate against Google Apps """

    def authenticate(self, username=None, password=None):
        user = None
        email = '%s@%s' % (username, DOMAIN)
        admin_email = '%s@%s' % (ADMIN_USERNAME, DOMAIN)
        try:
            # Check the user's password by logging in to a user-accessible API
            gdocs = DocsService()
            gdocs.email = email
            gdocs.password = password
            gdocs.ProgrammaticLogin()
            # Get the Google Apps user object via the Provisioning API
            gapps = AppsService(domain=DOMAIN)
            gapps.ClientLogin(username=admin_email,
                              password=ADMIN_PASSWORD,
                              account_type='HOSTED', service='apps')
            guser = gapps.RetrieveUser(username)
            # Get or create the matching Django user and sync its details
            user, created = User.objects.get_or_create(username=username)
            user.email = email
            user.last_name = guser.name.family_name
            user.first_name = guser.name.given_name
            user.is_active = not guser.login.suspended == 'true'
            user.is_superuser = guser.login.admin == 'true'
            user.is_staff = user.is_superuser
            user.save()
        except:
            pass
        return user

    def get_user(self, user_id):
        user = None
        try:
            user = User.objects.get(pk=user_id)
        except:
            pass
        return user

Let's briefly review this code. authenticate() uses the GData Python library to ensure the username and password match with the actual Google Apps account. Since you need an administrator account to use the Provisioning API, I chose an arbitrary user-accessible API (Google Docs) to verify the user's password. If the password doesn't match, an exception is thrown, None is returned, and the login fails. If it does match, we log in to the Provisioning API with admin credentials to get the Google Apps user object, guser. Then, using a built-in helper method, we attempt to get the Django User object with matching username, or create a new one. Either way, we take the opportunity to update the User object with data from Apps. get_user() is a required function (as we are creating a class to meet a "duck-type" interface, rather than inheritance). We simply return a Django User, if one exists, or None.

Finally, to enable this back-end, you must modify the site's settings.py file, ensuring 'django.contrib.auth' is included in INSTALLED_APPS, and adding 'mymodule.django_backend.GoogleAppsBackend' to AUTHENTICATION_BACKENDS. You can now test logging into your site as Google Apps users. If you have enabled 'django.contrib.admin', you can then login to your site's admin console and see that these users were automatically added into your Django auth system. You could also easily create a web page to list these users by passing 'users': User.objects.all() into a template and writing template code such as:

<ul>{% for user in users %}<li>{{ user.email }}</li>{% endfor %}</ul>

We hope you find this code useful. Feel free to use any or all of it in your own Django web applications. If you do, please let us know in the comments!

Wednesday, October 8, 2008

Calendar Resource Management with the Google Data API

Matt Pruden

In many enterprises, there is no piece of real estate more scarce than an unoccupied conference room. With so much importance placed on conference rooms, their rigorous management is critical to a successful Google Apps deployment.

While Google Calendar offers a flexible system for reserving conference rooms, projectors, scooters, or any other shared resource, it does not provide a documented API for creating, updating, and deleting resources. Instead, you must manually manage resources through the Google Apps control panel. Manual management may work for a small number of resources but becomes unscalable when managing thousands.

However, creative developers can find just such a Google Data (GData) API for provisioning resources. In this post, we'll explore how to create, read, update, and delete calendar resources using GData through cURL, the commonly available command line HTTP client.

Discovering Calendar Resource support in GData.


Each type of entry in Google, whether a spreadsheet row, user account, or nickname, has a collection URL. In true REST fashion, a GET request to the collection URL will return a list of entries. For example, a GET request to http://www.google.com/calendar/feeds/default/private/full will return a feed of calendar event entries. Likewise, a POST to this URL will add a new event entry to a calendar. So, to retrieve and create resources, we first need to discover the collection URL for calendar resources.

A calendar resource has many of the same characteristics as a user. For example, a calendar resource can be a meeting attendee and can be browsed by clicking "check guest and resource availability" in the Calendar user interface. Also, a calendar resource isn't tied to a particular user when it is created. It is reasonable to believe that managing calendar resources through the API might closely mimic managing users through the provisioning API.

In the provisioning API, the collection URL for user accounts looks like this: https://apps-apis.google.com/a/feeds/domain/user/2.0. What if we change user to resource resulting in a URL like this: https://apps-apis.google.com/a/feeds/domain/resource/2.0? The example below uses the cURL application to send a GET request to the new URL. For details on using cURL with GData, see Google's documentation.

curl -s -k --header "Authorization: GoogleLogin auth=DQAAAH4AA" https://apps-apis.google.com/a/feeds/mydomain.com/resource/2.0 | tidy -xml -indent -quiet
<?xml version="1.0" encoding="utf-8"?> <feed xmlns="http://www.w3.org/2005/Atom" xmlns:openSearch="http://a9.com/-/spec/opensearchrss/1.0/" xmlns:gCal="http://schemas.google.com/gCal/2005" xmlns:apps="http://schemas.google.com/apps/2006" xmlns:gd="http://schemas.google.com/g/2005"> <id>https://apps-apis.google.com/a/feeds/mydomain.com/resource/2.0</id> <updated>1970-01-01T00:00:00.000Z</updated> <category scheme="http://schemas.google.com/g/2005#kind" term="http://schemas.google.com/apps/2006#resource"/> <link rel="http://schemas.google.com/g/2005#feed" type="application/atom+xml" href="https://apps-apis.google.com/a/feeds/mydomain.com/resource/2.0"/> <link rel="http://schemas.google.com/g/2005#post" type="application/atom+xml" href="https://apps-apis.google.com/a/feeds/mydomain.com/resource/2.0"/> <link rel="self" type="application/atom+xml" href="https://apps-apis.google.com/a/feeds/mydomain.com/resource/2.0"/> <openSearch:startIndex>1</openSearch:startIndex> <entry> <id>https://apps-apis.google.com/a/feeds/mydomain.com/resource/2.0/-81411918824</id> <updated>1970-01-01T00:00:00.000Z</updated> <category scheme="http://schemas.google.com/g/2005#kind" term="http://schemas.google.com/apps/2006#resource"/> <title type="text">Bldg 3, room 201</title> <link rel="self" type="application/atom+xml" href="https://apps-apis.google.com/a/feeds/mydomain.com/resource/2.0/-81411918824"/> <link rel="edit" type="application/atom+xml" href="https://apps-apis.google.com/a/feeds/mydomain.com/resource/2.0/-81411918824"/> <gd:who valueString="Bldg 3, room 201" email="mydomain.com_2d3831343131393138383234@resource.calendar.google.com"> <gCal:resource id="-81411918824"/> </gd:who> </entry> </feed>

We've found the collection URL for calendar resources! Now, we just need to determine the XML schema for an individual resource. An hour of trial and error results in the following schema:

<?xml version='1.0' encoding='UTF-8'?>
<ns0:entry xmlns:ns0="http://www.w3.org/2005/Atom">
  <ns0:category scheme="http://schemas.google.com/g/2005#kind"
                term="http://schemas.google.com/apps/2006#resource" />
  <ns1:who valueString="long name" xmlns:ns1="http://schemas.google.com/g/2005">
    <ns2:resource id="short name" xmlns:ns2="http://schemas.google.com/gCal/2005" />
  </ns1:who>
</ns0:entry>

Since Google already does a great job of explaining the GData API, this post will not repeat that information. Instead, you can use the collection URL and entry schema in the same fashion as the other GData APIs to create, read, update, and delete calendar resources.
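As one illustration (not from the original post), here is roughly what creating a resource could look like from Apex, POSTing the entry schema above to the collection URL we discovered. The class, parameters, and auth-token handling are all assumptions; the token would come from ClientLogin as in the cURL example, and apps-apis.google.com would need to be registered as a Remote Site.

public class CalendarResourceClient {
    // Creates a calendar resource by POSTing the Atom entry from this post
    // to the collection URL. authToken is obtained separately via ClientLogin.
    public static void createResource(String domain, String authToken,
                                      String shortName, String longName) {
        String body = '<?xml version=\'1.0\' encoding=\'UTF-8\'?>'
            + '<ns0:entry xmlns:ns0="http://www.w3.org/2005/Atom">'
            + '<ns0:category scheme="http://schemas.google.com/g/2005#kind" '
            + 'term="http://schemas.google.com/apps/2006#resource" />'
            + '<ns1:who valueString="' + longName + '" xmlns:ns1="http://schemas.google.com/g/2005">'
            + '<ns2:resource id="' + shortName + '" xmlns:ns2="http://schemas.google.com/gCal/2005" />'
            + '</ns1:who></ns0:entry>';

        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://apps-apis.google.com/a/feeds/' + domain + '/resource/2.0');
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/atom+xml');
        req.setHeader('Authorization', 'GoogleLogin auth=' + authToken);
        req.setBody(body);
        HttpResponse res = new Http().send(req);
        System.debug(res.getStatusCode() + ' ' + res.getBody());
    }
}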

Tuesday, October 7, 2008

Overcoming Customer Portal Object Access Limitations Using Proxies

Michael McLaughlin

If you have ever tried exposing a Campaign, Contract, Lead, Opportunity, Pricebook, or Product in the Customer Portal, you have most likely been met with the dreaded "Insufficient Permissions" screen. Customer Portal hides these standard objects for obvious reasons (you don't necessarily want external users to access your organization's most proprietary data); however, there are times when allowing read access to these objects would facilitate certain operations. For example, it would be great to expose your product catalog (i.e., Product, Pricebook, and PricebookEntry) to your customers. How can read access be achieved given these limitations? The workaround described below uses what I will call "proxy classes" that can stand in for these blocked standard objects.

The first step to using a proxy class is to create a custom object through Salesforce's administrative control panel. This is your opportunity to create an object that contains the fields you want from the standard object plus any additional fields that might be handy such as a formula field concatenating different values or even fields from other classes that you can get to via an object-to-object relationship. The idea here is to create an object that mimics (closely or completely) the standard object that you are otherwise not able to see in Customer Portal. When creating the proxy object the key is to establish a connection to the blocked standard object. This is done by creating a Lookup field on the proxy object that points to the ID field of the standard object. By creating this Lookup, you have now created a foreign key into the standard object. Now you can access other fields in the standard object by leveraging this relationship. In Apex, you can code RelationshipName__r.OtherField to gain access to the other fields...the Lookup you created is the gateway into the object. Remember to enable permissions on the object for Customer Portal users!

Now that your proxy class is created and it mimics the standard object you need to pump some data into it. For an initial data load, use the Apex Data Loader to 1) export data from the standard object into a CSV 2) manipulate the resulting CSV as necessary and 3) map the exported CSV back into an import for your proxy class. An alternative method would be to write an Apex class that loops through the standard object and inserts the data into the proxy class. Use whatever data loading technique you feel most comfortable with.

Armed with a data-populated proxy class, you are now ready to expose this data to the Customer Portal. You can use this proxy class in place of your standard object in all of your Visualforce pages, tabs, related lists, etc. You are simply using this proxy class, which has permissions in Customer Portal, in place of the blocked standard object. The data is the same (or even customized, depending on how you structured the proxy), but now you can see and work with it.

Finally, you will want to keep your proxy object populated with fresh, current data from the standard object. This can easily be done by adding a trigger to the standard object that updates the proxy (a minimal sketch follows below). Keep in mind that triggers are not allowed on certain objects (for example, Pricebook and PricebookEntry). A creative workaround is to use a batch update as described here.
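For illustration only, here is roughly what such a sync trigger could look like, assuming a hypothetical proxy object Opportunity_Proxy__c with a lookup field Opportunity__c and a mirrored Amount__c field; adjust the object and field names to whatever your proxy actually mimics.

trigger SyncOpportunityProxy on Opportunity (after insert, after update) {
    Set<Id> oppIds = new Set<Id>();
    for (Opportunity o : Trigger.new) {
        oppIds.add(o.Id);
    }

    // Find proxy rows that already point at these Opportunities via the lookup
    Map<Id, Opportunity_Proxy__c> proxiesByOpp = new Map<Id, Opportunity_Proxy__c>();
    for (Opportunity_Proxy__c p : [select Id, Opportunity__c from Opportunity_Proxy__c
                                   where Opportunity__c in :oppIds]) {
        proxiesByOpp.put(p.Opportunity__c, p);
    }

    // Update existing proxy rows or create new ones, then write them in one DML call
    List<Opportunity_Proxy__c> toUpsert = new List<Opportunity_Proxy__c>();
    for (Opportunity o : Trigger.new) {
        Opportunity_Proxy__c p = proxiesByOpp.get(o.Id);
        if (p == null) {
            p = new Opportunity_Proxy__c(Opportunity__c = o.Id);
        }
        p.Name = o.Name;          // copy over whichever fields the proxy mirrors
        p.Amount__c = o.Amount;
        toUpsert.add(p);
    }
    upsert toUpsert;
}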



Thursday, October 2, 2008

Google Earth Integration via Visualforce

 

The VisualForce "contentType" page attribute makes it easy to push data from Salesforce directly to other apps. Here, we'll review an example using Google Earth. We use KML to view Salesforce Opportunities on a 3D map. Let's start with the page itself:

    <apex:page controller="KMLController" cache="true" showHeader="false" contentType="application/vnd.google-earth.kml+xml">
    <kml xmlns="http://earth.google.com/kml/2.0">
    <Document>
    <name>Salesforce Opportunities</name>
    <apex:repeat value="{!oppList}" var="o">
    <Placemark>
    <name>{!o.Name}</name>
    <address>{!o.Account.BillingStreet} {!o.Account.BillingCity}, {!o.Account.BillingState} {!o.Account.BillingPostalCode}</address>
    <description>
    <![CDATA[
    <p><b>Account: </b>{!o.Account.Name}
    <p><b>Amount: </b>${!o.Amount}
    <p><b>Close Date: </b>{!MONTH(o.CloseDate)}/{!DAY(o.CloseDate)}/{!YEAR(o.CloseDate)}
    ]]>
    </description>
    </Placemark>
    </apex:repeat>
    </Document>
    </kml>
    </apex:page>
Note the following:

  • The contentType="application/vnd.google-earth.kml+xml" attribute notifies the browser that the page content should be passed to Google Earth.

  • The cache="true" attribute addresses this IE security issue.

  • The meat of the page is in an <apex:repeat> block that iterates over a list of Opportunities. In this example, we're mapping the opportunity address, but you could use the Geocoding API to specify a Point with specific longitude and latitude coordinates.

The page controller retrieves a List of Opportunity objects based on a comma-delimited URL parameter:

public class KMLController {

    public Opportunity[] oppList {get; set;}

    public KMLController() {
        String sel = '';
        if (null != ApexPages.currentPage().getParameters().get('sel')) {
            sel = ApexPages.currentPage().getParameters().get('sel');
        }
        String[] idList = sel.split(',', 0);
        oppList = [SELECT Id, Name, Amount, CloseDate,
                   Account.Name, Account.BillingStreet, Account.BillingCity,
                   Account.BillingState, Account.BillingPostalCode
                   FROM Opportunity
                   WHERE id IN :idList];
    }
}

Finally, an Opportunity custom button is used to invoke the Visualforce page, passing a list of selected Opportunity Ids from a List View or Related List:

var sel = {!GETRECORDIDS($ObjectType.Opportunity)};

if (!sel.length) {
    alert("Please select at least one opportunity for mapping.");
} else {
    var d = new Date(); // Append milliseconds to URL to avoid browser caching
    url = "/apex/KMLPush?ms=" + d.getTime() + "&sel=" + {!GETRECORDIDS($ObjectType.Opportunity)};
    window.location.href = url;
}

When the button is clicked, the selected Opportunities will be displayed (via KML) in Google Earth.


If the KML file doesn't open properly, you might need to manually add the following Windows registry entries:

[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\MIME\Database\Content Type]

[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\MIME\Database\Content Type\application/vnd.google-earth.kml+xml]
"Extension"=".kml"

[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\MIME\Database\Content Type\application/vnd.google-earth.kmz]
"Extension"=".kmz"

Tuesday, September 23, 2008

Page breaks in Visualforce PDF templates

Kyle Roche

The Visualforce page component defines a renderAs attribute that supports certain content converters. This is extremely useful when automatically printing forms, receipts, reports, etc. Often, we're asked to create nicely formatted forms that span multiple pages. If you leave the control of page breaks to the browser, unexpected things can happen. This is easy to solve with some basic CSS: you can use the page-break style properties to control where the browser will insert a page break, and the Force.com PDF content converter will carry that over to the PDF.

Here's the basic code to demonstrate how this works. Create a new Visualforce Page called MultiPagePDF. Add the following code to the page:

        <apex:page renderas="pdf">
        <div style="page-break-after:always;">
        This is page one
        </div>
        <div style="page-break-after:always;">
        This should be rendered as page two
        </div>
        And, finally... page three
        </apex:page>

When rendered, this should yield one PDF page per section. Some natural extensions to this posting would be to insert these breaks dynamically. Inserting <div> tags and binding the style to an APEX property could be one way to accomplish this: you would pass back a blank style in some cases and return a string with the value of "page-break-after:always" for <div> sections where a break is needed (see the sketch below).
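As a rough sketch of that idea (the wrapper class and property names are made up for illustration), the controller could expose sections with a computed style:

public class PdfSection {
    public String body { get; set; }
    public Boolean needsBreak { get; set; }
    // Return the page-break style only when a break should follow this section
    public String breakStyle {
        get { return needsBreak == true ? 'page-break-after:always;' : ''; }
    }
}

On the Visualforce page, an <apex:repeat> over those sections would then emit <div style="{!s.breakStyle}">{!s.body}</div> for each one.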

As a quick side note, you can get a bit more advanced with the page formatting via CSS. The following snippet shows you how to switch the page layout to landscape and add page numbers to your Visualforce page. This was found in the Case History Timeline example.

@page {
    /* Landscape orientation */
    size: landscape;

    /* Put page numbers in the top right corner of each
       page in the pdf document. */
    @top-right {
        content: "Page " counter(page);
    }
}

Wednesday, September 17, 2008

Access Custom Fields on Standard Objects in Force.com IDE

Kyle Freeman

I recently created a new Force.com Project in Eclipse, but was surprised to find that there was not an option to include Standard Objects in the project manifest. Upon first glance it appears that only Custom Objects can be included.


This force.com thread by JonP discusses a workaround, summarized here.

First, it is worthwhile to note that you cannot include standard fields on standard objects. However, you can include CUSTOM fields on standard objects. From your existing project in Eclipse, open the package.xml file located at src -> unpackaged -> package.xml. Locate the CustomObject section and add a new member for each standard object you would like included in your project.

This method will include all custom fields on the standard object in your project. For example, I was able to use the following snippet to include all custom fields on the Account, Contact, and Case objects, as well as all fields from all custom objects. Simply add the code to the package.xml file and then refresh the project from the server.

<types>
    <members>*</members>
    <members>Account</members>
    <members>Case</members>
    <members>Contact</members>
    <name>CustomObject</name>
</types>

Care should be taken when pushing to production, however, as including a standard object will also cause field-level security settings to be pushed to profiles in the salesforce.com production environment. Should your production field-level security settings be out of sync with your sandbox, you could run into some issues.

Utilizing this method can prove invaluable if you need to migrate a large number of custom fields and do not want to recreate them manually in production.

Monday, September 15, 2008

Complementing Visualforce - Girafa Thumbnail Service

Kyle Roche

There are so many technologies out there to complement your Salesforce.com projects. In a previous posting I showed you how to leverage Google's Geocoding API from within your APEX classes. In this example, let's take a look at something more useful in the UI layer.

It's pretty typical in projects with heavy Visualforce development to create a custom search form. Recently, I was working on a project where we created a custom search form for the Account object. Instead of showing the Account's logo image, we were thinking it would be a cool enhancement to display a thumbnail of the Account's website. You could use services like Amazon's Alexa Site Thumbnail Service (as shown in the APEX Language Reference). However, most of these aren't free. After some Googling, I stumbled on the Girafa Thumbnail Service, which offers a free service to those who use fewer than 2,000 image requests per day (200 requests per day at the time of this writing; please check Girafa for updated daily limits).

Let's get started. Create a Visualforce page called GirafaThumbnail and a controller called GirafaThumbnailController. In your controller add the following APEX Property.

public string GirafaUrl
{
    get
    {
        if (GirafaUrl == null)
        {
            string algorithmName = 'MD5';
            string signatureKey = 'your signature key here';
            string clientId = 'your client id here';
            string siteUrl = 'http://www.appirio.com/';
            string inputString = signatureKey + siteUrl;
            Blob myDigest = Crypto.generateDigest(algorithmName, Blob.valueOf(inputString));
            string myDigestString = EncodingUtil.convertToHex(myDigest);
            integer myStartingPosition = myDigestString.length() - 16;
            integer myEndingPosition = myDigestString.length();
            string mySubString = myDigestString.substring(myStartingPosition, myEndingPosition);
            GirafaUrl = 'http://scst.srv.girafa.com/srv/i?i=' + clientId + '&s=' + mySubString + '&r=' + siteUrl;
        }
        return GirafaUrl;
    }
    set;
}

Let's break this down by first understanding how the Girafa thumbnail service works. There are a few basic steps to generating the URL for your thumbnail. Girafa's thumbnail images are embedded using the <img> tag. The call is formatted as follows:

http://scst.srv.girafa.com/srv/i?i=<client ID>&s=<signature>&r=<site URL>

The client ID is your Girafa Client ID, supplied when you register your account. The site URL is the URL of the website for which you would like to generate a thumbnail; in our example, http://www.appirio.com.

The signature authenticates your request and is generated using the following steps:

1) Concatenate your secret key (chosen when you create your account) and the site URL

2) Calculate the MD5 checksum of the concatenated string

3) The signature is the 16 least significant hexadecimal digits of the MD5 checksum

In our method we set the variables we'll need to generate the image call, then we use the Crypto class's generateDigest() method to calculate the MD5 checksum of our concatenated string. Some basic string manipulation returns the 16-character substring, and we put our URL together to return to our caller.

In your Visualforce page, add an image tag to see the results. Bind the src attribute of the <img> to our APEX Property: <img src="{!GirafaUrl}"></img>. The thumbnail of the site should then appear on your page.


Sunday, September 14, 2008

Escaping Quotes in Merge Fields

Glenn Weinstein

S-controls, button on-click Javascript, and Visualforce pages can contain merge fields, which SFDC replaces with their record values prior to rendering. But this is a straight replacement, without any opportunity to escape characters. This leads to messy results when a merge field contains a double-quote character.

For example, consider using this merge field in on-click Javascript:

project.Name = "{!Opportunity.Name}";

That would work great, but consider what happens if the opportunity is named Acme "World Peace" Project (with the double quotes). SFDC will render the code as:

project.Name = "Acme "World Peace" Project";

Okay, now you have a huge mess. This will break the Javascript. A very clever workaround for this was created by an Appirio colleague of mine, Linda Evans. Here it is:

Step 1: Wrap the value in a hidden textarea.
Step 2: Retrieve the value by traversing the DOM.

So in our example above, first, you'd drop this HTML element into your page:

<textarea id="opportunityName" style="display:none">{!Opportunity.Name}</textarea>

Then, inside a <script> element, you'd retrieve the value:

project.Name = document.getElementById("opportunityName").value;

I told you this was clever - thanks Linda!

One final note: this won't work (directly) in button on-click Javascript, since you can't put HTML elements there (the entire body is treated as a <script>). So you'll have to move your code into an s-control, and then tie your button to the s-control.

Wednesday, August 27, 2008

Defaulting your mailto: links to Google Apps in Firefox 3

Tim Parker

If you're using Google Apps and Firefox and would like to default email links to open in Google Apps, follow the steps below.

First, make sure you're using Firefox 3. Open Firefox and enter about:config into the address bar. This will bring up a warning message; click on the "I'll be careful, I promise!" button.


Next, we need to edit the option that lets web pages register themselves as protocol handlers, so enter gecko.handler in the filter bar and select the option highlighted below:

gecko.handlerService.allowRegisterFromDifferentHost

Make sure this option is set to True (you can do so by double-clicking that entry in the browser). This allows mailto: links to forward to web-based email providers such as Google Apps. If you wanted to send webcal: links to a web-based calendar like Google Calendar, you would need this setting enabled as well.

Now that we've enabled this option we need to register Google Apps as a handler. To do this, we need to execute a single line of JavaScript. In the address bar, copy and paste the following:

javascript:window.navigator.registerProtocolHandler('mailto','https://mail.google.com/a/yourdomain.com/mail/?extsrc=mailto&url=%s','Google Apps')

Note that you will need to replace yourdomain.com with your actual domain name. Firefox will prompt you to add an application. Click "Add Application."

Finally, navigate to Tools / Options / Applications and set Google Apps as your default for mailto.

Wednesday, August 13, 2008

First look at Dynamic APEX

Kyle Roche

I opened a case in my Dev org a few days ago to request that Dynamic APEX be enabled, and I took my first look at it today. The first step is to open a case in your development org; it took about 48 hours to get the setting enabled. The rest of this post and the follow-up posts assume that step has been completed.

Start by creating a new Visualforce page. I'm using a page called /apex/dynamicApex. I created a custom controller called dynamicApexController by changing the <apex:page> component as follows:

<apex:page controller="dynamicApexController">

Let's start with the controller. Switch your editor to the controller view and add the following APEX Property. As in my previous posts, I'm using APEX Properties in place of the old getters / setters. For more information on APEX Properties see the Summer 08 Developer's Guide.

public List<Account> DynamicAccountList
{
    get
    {
        if (DynamicAccountList == null)
        {
            string myDynamicQuery = 'select id, name from Account limit 10';
            DynamicAccountList = Database.Query(myDynamicQuery);
        }
        return DynamicAccountList;
    }
    set;
}

Now, this is obviously a basic example; I'll extend it with some more complicated situations in the coming posts. Since we have Dynamic APEX enabled, we can now use dynamic SOQL, SOSL, and DML. We're creating a string to hold our dynamic SOQL query and passing it to the Database.Query() method, which evaluates it at runtime. The possibilities for customization are endless.
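As a quick taste of where that goes (the searchName property here is just an illustration, not from the series), you can splice user input into the query string, escaping quotes first:

public String searchName { get; set; }

public List<Account> getFilteredAccounts()
{
    // Escape quotes in the user-supplied value before building the query string
    string safeName = String.escapeSingleQuotes(searchName);
    string myDynamicQuery = 'select id, name from Account where name like \'%' + safeName + '%\' limit 10';
    return Database.Query(myDynamicQuery);
}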

To display the results on the Visualforce Page add a quick dataTable component.

<apex:dataTable value="{!DynamicAccountList}" var="acct">
    <apex:column value="{!acct.id}"></apex:column>
    <apex:column value="{!acct.name}"></apex:column>
</apex:dataTable>

We'll look at some examples of dynamic queries built on user input in the following posts in this series.

Saturday, August 9, 2008

Google Geocoding from Visualforce

Kyle Roche

Mashups are becoming a common part of most implementations. Legacy application replacements are sometimes phased in by building a mashup in Salesforce.com of the current application(s) and slowly replacing components as they are reconstructed using native Salesforce. Google Maps mashups are among the most popular. However, we found most of the examples were either using the AJAX toolkit or were hard-coding the latitude / longitude coordinates. In this example, we'll demonstrate how to use Google's Geocoding API to geocode an Account address from within your Visualforce controller. The key difference is that this example geocodes the address using server-side scripting.

Start off by creating a new APEX Class called GoogleGeocodeExtension. This will be our controller extension. Remember, controller extensions have constructors that take an argument of the controller they extend. In this example, we'll be extending the standard controller for the Account object. Make sure your class looks like the following.

public class GoogleGeocodeExtension {
    private final Account acct;

    public GoogleGeocodeExtension(ApexPages.StandardController stdController) {
        this.acct = (Account)stdController.getRecord();
    }
}

Google's Geocoding API can be accessed via server side scripting. You can choose different output formats like XML, CSV, JSON (default). In our case, we'll keep things simple and return the results in CSV format. Add the following property to your APEX class.

public string[] Coordinates
{
    get
    {
        if (Coordinates == null)
        {
            Account myAccount = [select name, billingstreet, billingcity, billingstate, billingpostalcode
                                 from Account where id = :acct.id];
            String url = 'http://maps.google.com/maps/geo?';
            url += 'q=' + EncodingUtil.urlEncode(myAccount.BillingStreet,'UTF-8') + ',' +
                   EncodingUtil.urlEncode(myAccount.BillingCity,'UTF-8') + ',' + myAccount.BillingState;
            url += '&output=csv&key=yourgooglemapkeyhere';

            Http h = new Http();
            HttpRequest req = new HttpRequest();

            req.setHeader('Content-type', 'application/x-www-form-urlencoded');
            req.setHeader('Content-length', '0');
            req.setEndpoint(url);
            req.setMethod('POST');

            HttpResponse res = h.send(req);
            String responseBody = res.getBody();
            Coordinates = responseBody.split(',', 0);
        }
        return Coordinates;
    }
    set;
}

This APEX property queries the billing address for the Account record and passes it to the Google Geocoding API. Because spaces and other special characters can appear in addresses and city names, we need to use the urlEncode() method to properly format these strings.

We chose to use the CSV format on our response. So, we simply need to split the string by the comma delimiter so we can access each field individually. To keep things simple you can add the following two properties to your controller extension.

public string CoordinateLat { get { return Coordinates[2]; } }
public string CoordinateLong { get { return Coordinates[3]; } }

Like any other APEX property in a controller or extension you can access these in your Visualforce page using the standard binding syntax {!CoordinateLat}.
Friday, July 25, 2008

Extending Visualforce's UI - Ext JS DataGrid

Kyle Roche

In this posting we're going to use the Ext JS library to build out a custom DataGrid in a Visualforce page. Ext JS is a JavaScript library for building richer UI layers without the need for plug-in technologies. Ext JS has some fantastic UI widgets, and Open Source licenses are available.

This technique will use the Simple Data Store. In future postings we'll build on this with examples using JSON and other technologies. The first step is to download the latest version of the Ext JS SDK. We're going to upload the archive as a Static Resource so we can reference it from within our Visualforce application. Remember, there's a 5MB per file limit on Static Resources, so make sure you remove the sample directory from the SDK before trying to upload it as a Static Resource. Your uploaded resource should look like this:

Referencing a Static Resource
Static Resources can be referenced by using the $Resource global variable. Archives are especially interesting because you can reference the full path to a file within an archive, saving you considerable effort organizing your uploads. We'll be referencing the following resources for this example.

StyleSheet: /ext-2.1/resources/css/ext-all.css
JavaScript: /ext-2.1/adapter/ext/ext-base.js
JavaScript: /ext-2.1/ext-all.js

Creating the Visualforce Page
First, make sure you have development mode enabled: Setup | My Personal Information | Personal Information | Development Mode. Create a new Visualforce page by pointing your browser to http://server.salesforce.com/apex/ExtJs_DataGrid_Part1. Follow the prompt to create your page. Development mode enables two things in your org. First, you now have the ability to create a page via the URL, as you've just done. Secondly, you can edit Visualforce pages from within your browser. In the bottom left of the browser, click on Page Editor to open the Visualforce Editor. Add the following code to reference the Ext JS library from our Static Resource.

<link rel="stylesheet" type="text/css" href="{!$Resource.ExtJs}/ext-2.1/resources/css/ext-all.css" />
<script type="text/javascript" src="{!$Resource.ExtJs}/ext-2.1/adapter/ext/ext-base.js"></script>
<script type="text/javascript" src="{!$Resource.ExtJs}/ext-2.1/ext-all.js"></script>

Click Save. Our page is now referencing the three static resources within the archive we uploaded. Note the use of the $Resource variable in our data binding calls.

Gathering Some Sample Data
Ext JS' "Simple Data Store" is built by constructing a string-based Array to which we'll bind our DataGrid. Let's create a custom controller to retrieve the data for our Visualforce page. In the Page Editor, change the <apex:page> component to read as follows.

<apex:page sidebar="false" controller="ExtJSDataGrid1Controller">

You will see an option above the toolbar in the Editor to create a new APEX Class. After clicking that link you should see a new button next to Page Editor called Controller. Open the code for the Controller and paste in the following. Note the use of the Apex Property in place of the traditional getter and setter methods. This is new to Summer 08.

public class ExtJSDataGrid1Controller {
    public List<Contact> myContacts {
        get {
            if (myContacts == null) {
                myContacts = [SELECT Id, FirstName, LastName FROM Contact LIMIT 10];
            }
            return myContacts;
        }
        set;
    }
}

Adding Ext JS to our Visualforce Page
Now that we have our SOQL query returning 10 Contact records, we're going to add the JavaScript to generate our Ext JS DataGrid. Paste the code below into the Page Editor. Note our use of the <apex:repeat> component to build out our DataStore. Typically, the use case for the <apex:repeat> component would be to build out repeating UI elements. In this situation we're using it to build out our <script> for the rendered page.

        <script type="text/javascript">
        Ext.onReady(function(){
        Ext.state.Manager.setProvider(new Ext.state.CookieProvider());
        var myDataString = 'var myData = [ ';
        <apex:repeat value="{!myContacts}" var="con" id="ContactRepeat">
        myDataString += "['{!con.Id}','{!con.FirstName}','{!con.LastName}'],";
        </apex:repeat>
        myDataString += "['','',''] ];";
        eval(myDataString);
        var store = new Ext.data.SimpleStore({fields:[{name:'ID'},{name:'FirstName'},{name:'LastName'}]});
        store.loadData(myData);
        // CREATE THE GRID
        var grid = new Ext.grid.GridPanel({store: store, columns: [
        {id: 'ID', header: "ID", width: 50, sortable: true, dataIndex: 'ID'},
        {id: 'FirstName', header: "First Name", width: 150, sortable: true, dataIndex: 'FirstName'},
        {id: 'LastNme', header: "Last Name", width: 150, sortable: true, dataIndex: 'LastName'}
        ],stripeRows:true, autoExpandColumn: 'ID', height: 500, width: 1000, title: 'MY EXT JS CONTACT LIST'});

        grid.render('myContactList-grid');
        grid.getSelectionModel().selectFirstRow();
        });
        </script>

        Don't Forget the DataGrid

        Add the following <div> tag and click Save to view your new Ext JS driven DataGrid.
        <div id="myContactList-grid"></div>


        Screenshot: the completed Ext JS DataGrid (ExtJSDataGrid_Part1_Complete).

        Extending Visualforce's UI - Series Intro

        Kyle Roche


        Welcome to the first in an ongoing series of posts exploring the approaches available for extending the Visualforce UI using different sets of 3rd party components. We'll look at technologies like Flex, Ext JS, YUI, and Google's Chart API, to name a few.

        Overview of Visualforce
        Visualforce allows you to abstract the user interface layer from your standard Salesforce org and create something completely unique. It is the gateway to PaaS on the Force.com platform: Visualforce is built on the MVC design pattern and leverages Apex to define UI behaviors and navigation routines.
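
        As a rough illustration of that separation (the controller name and Account query below are just for this example, not part of the series), a minimal page/controller pair might look like this:

        <apex:page controller="AccountListController">
            <!-- View: markup only, bound to data exposed by the controller -->
            <apex:pageBlock title="My Accounts">
                <apex:pageBlockTable value="{!accounts}" var="a">
                    <apex:column value="{!a.Name}"/>
                </apex:pageBlockTable>
            </apex:pageBlock>
        </apex:page>

        // Controller: Apex exposes the data and would hold any navigation logic
        public class AccountListController {
            public List<Account> getAccounts() {
                return [SELECT Id, Name FROM Account LIMIT 10];
            }
        }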

        Extending Visualforce's UI Layer
        In many cases the standard Visualforce UI components available at GA don't meet every need. Instead of reverting to client-side technologies and making calls back through the AJAX Toolkit, we're going to walk you through some techniques for integrating 3rd party UI libraries while still leveraging the server-side power of Visualforce!

        Static Resources in Visualforce
        With Summer '08, we have a new kind of Salesforce storage designed specifically with Visualforce in mind, called Static Resources. You can upload files via Setup | Develop | Static Resources. Libraries or files uploaded as Static Resources can then be referenced in your Visualforce pages: stylesheets, JavaScript, images, and even archives can all be linked to through the $Resource global variable. For more information on Static Resources, see the Visualforce Developer's Guide. Organize your Static Resources by compiling the items needed for your Visualforce application into archive files (remember, there is a 5MB per-file limit); Visualforce can bind to files within the archive, so there's no need to upload each file individually!
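
        For instance, a standalone image resource and a file inside an archive might be referenced like this (the resource names CompanyLogo and UILibrary are hypothetical):

        <apex:image url="{!$Resource.CompanyLogo}" width="50" height="50"/>
        <apex:includeScript value="{!URLFOR($Resource.UILibrary, 'js/widgets.js')}"/>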

        Stay tuned for our next posting where we'll integrate a DataGrid from Ext JS into a Visualforce Page.

        Salesforce Customer Portal CSS Modifications

        Michael McLaughlin


        The Problem: How do I modify my Customer Portal implementation to match my public facing website?


        The Salesforce Customer Portal is a great extension to Salesforce.com that allows companies to open up certain areas of their Salesforce organization to designated customer users for self-servicing their accounts. The Portal lets customers make account changes, log cases and view their status, and find answers to common questions...all without involving a support representative and the inherent response delay.

        Out of the box, the Customer Portal retains the standard Salesforce.com look and feel we have all grown accustomed to. End users of the portal (the customer's customers), however, will most likely be unfamiliar with this user interface. In addition, since most Customer Portal implementations hang off of the main website of a customer, the standard salesforce.com look and feel could confuse end users.

        The Solution: Customer Portal CSS Mods

        One solution to this problem is to format the salesforce.com Customer Portal to match the style of the organization's website. This gives the customer's end users the impression that they are still on the organization's site.

        The Customer Portal allows minimal customization of fonts and colors through a simple administration interface, which also lets you supply a custom header and footer. To go beyond this and complete the visual transformation of your Customer Portal implementation, you can use Cascading Style Sheets (CSS).

        Salesforce does not publish documentation explaining which CSS classes define which Customer Portal screen elements; this leaves them free to change these style elements in future releases. As a result, unlike supported salesforce.com customizations, you may need to verify that your changes still work with each new salesforce.com release. In other words, be careful with the below :-)

        You need a tool that lets you get under the covers and see how CSS drives what the user sees. Mozilla Firefox users have access to an excellent free add-on called Firebug, which lets you "inspect" the content and structure of any web page. Turn Firebug on and navigate to the pages of your Customer Portal implementation. Hover over the screen elements you wish to modify, such as table headers, background images, etc., and take note of the CSS classes Firebug reports as responsible for each element's formatting.

        Now that you know which out-of-the-box Customer Portal styles drive the interface elements you wish to change, you can override them with your own style definitions. You must use the same style class names so the browser applies your definition instead of the default one. Do this by:
        • Modify the styles to your liking in CSS syntax
        • Save your changes to a CSS file you store in Static Resources or Documents
        • Reference your CSS file in the Customer Portal header HTML (accessible in the Customer Portal administration panel) using a <link> tag (see the sketch after this list)
          • OR, if your changes are small, simply use a <style> tag and insert your styles as HTML directly into the Customer Portal header
        • View the resulting changes in Customer Portal and tweak as necessary
        • Use Firebug to hover over your screen elements to confirm your styles are being applied
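
        For the <link> option, the header HTML might look something like the following; the href below is only a placeholder, so use whatever URL Salesforce shows for your uploaded Static Resource or Document:

        <link rel="stylesheet" type="text/css" href="https://na1.salesforce.com/resource/1234567890/PortalStyles" />
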
        Example of the solution in action:
        <style type="text/css">
        /* Redefining this style for Customer Portal to make the font smaller */
        body {
            font-size: 9pt;
        }

        /* Redefining this style for Customer Portal to make the listHeader an image */
        .listHeader {
            background-image: url(/dimg/portalTabRight96999C96999C.gif);
        }

        /* Redefining to make the sidebar an image. Kept its existing color and font definitions. */
        .sidebarModuleHeader,
        .nestedModule .sidebarModuleHeader {
            background-image: url(/dimg/portalTabRight96999C96999C.gif);
            color: #505154;
            font-family: Arial, Helvetica, sans-serif;
        }
        ...

        Screenshot of Firebug inspecting Customer Portal's style elements:

        Wednesday, July 16, 2008

        Welcome to the Appirio Tech Blog

        The team here at Appirio spends a lot of time building innovative plugins, apps and even complete business solutions using Software-as-a-Service (SaaS) and Platform-as-a-Service (PaaS) technologies from Google and salesforce.com. We like how easy it is to build these solutions-- that's what makes cloud computing so different (and so disruptive). But easy isn't automatic (no matter what your boss thinks). That's why we're writing this blog.

        Internally, the Appirio team talks a lot about design patterns, issues, workarounds, and cool tricks that make it easier to build on and use web platforms. It's fun to find a new way to solve a problem, and it's a badge of honor to see who can do it with the "least lines of code".

        We want you to be a part of our conversation. Why? So that you get as excited as we are about the power of cloud computing and can build your own cool applications. This blog is for you if you want to build on the cloud, whether you are a power-admin or an Apex wizard, whether you are a longtime Salesforce and Google customer or just getting started, whether you are an individual contributor or are managing an entire team of developers. We won't have code in every post, but we'll definitely be thinking about it.

        Here are a few of the topics we're thinking about; feel free to suggest your own.
        • Salesforce's Visualforce allows developers to design any UI on top of the Salesforce data model. What's possible with Visualforce today, and how can you make the most of it?
        • Salesforce's Apex programming language gives you a lot of flexibility and power, but using it effectively requires understanding its capabilities and limitations. Where can you find tips and tricks for developing in Apex?
        • Salesforce's Customer and Partner Portals let you share critical business information with people outside your company. How can you develop effective portal applications?
        • Google's AppEngine lets you build the next great web app, but it is still lacking basic features for enterprise use (e.g., support for HTTPS). Where does AppEngine fit for enterprises, and how can you use it with that in mind?
        • Everyone wants to use their on-demand applications from mobile devices like the BlackBerry and iPhone. What's the best way to enable these capabilities and finally unchain your users from their desks (e.g. super monkey ball for the enterprise)?
        • Security is top-of-mind for any company looking to move more of their business onto the cloud. What should you know when configuring roles and profiles, establishing authentication and authorization, and setting up single sign-on?
        As you can tell, there's a lot of ground to cover. We'll have folks from Appirio and occasionally some guest writers help us get there.