Lead Management Tips
The 10 most important tips for better data quality
The term lead management covers the generation, processing and evaluation of prospect and customer data. The aim is to use this data for more relevant marketing and, at the end of the day, more revenue.
Unfortunately, the success of lead management often suffers from poor quality of the underlying data. The consequences: the status of a lead in the buying process cannot be determined precisely, target groups cannot be defined accurately, or a lead is not addressed correctly.
We have gathered our best tips from practice to help you ensure a high quality of your lead data.
10 tips for successful lead management through good data quality
1. Define in advance which character set you want to use for your data, so that umlauts and special characters do not get corrupted.
Our recommendation: if the data is used exclusively in the German-speaking area, choose the character set ISO 8859-15; otherwise choose UTF-8. Make sure that all external data suppliers deliver their data in the same encoding. This can be checked by visually inspecting just a few test records (e.g. open the file in Excel with the required encoding and check whether umlauts and special characters are displayed correctly).
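A minimal sketch of such a spot check in Python, assuming a semicolon-separated file called leads_export.csv (file name, delimiter and encoding are placeholders for your own delivery):

```python
# Read a few records from a delivered CSV file and print them,
# so umlauts and special characters can be checked visually.
import csv

ENCODING = "utf-8"  # or "iso-8859-15" for purely German-language data

with open("leads_export.csv", encoding=ENCODING, newline="") as f:
    reader = csv.reader(f, delimiter=";")
    for i, row in enumerate(reader):
        print(row)      # umlauts such as "ä", "ö", "ü", "ß" should display correctly
        if i >= 4:      # a handful of test records is enough for a spot check
            break
```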
2. Work with a separate, unique number (ID) per data record that you can use as a reference.
If you instead use an existing field as the reference – the e-mail address, for example – and the value of that field later changes, all references to this record are lost.
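A minimal sketch of this idea, with illustrative field names and an in-memory dictionary standing in for the actual database:

```python
# Each record gets its own surrogate ID that never changes, so references
# stay valid even if a content field such as the e-mail address is edited later.
import itertools

_next_id = itertools.count(1)
leads = {}

def add_lead(email, last_name):
    lead_id = next(_next_id)  # stable reference, independent of any content field
    leads[lead_id] = {"email": email, "last_name": last_name}
    return lead_id

ref = add_lead("anna.muster@example.com", "Muster")
leads[ref]["email"] = "a.muster@example.org"  # e-mail changes, the reference still works
print(ref, leads[ref])
```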
3. Consider in advance which fields should be alphanumeric rather than numeric.
A zip code field, for example, has to be alphanumeric; otherwise zip codes like “01234” are automatically shortened to “1234”. A telephone number field has to be alphanumeric as well, which also allows clearer entries such as “++49 89 / 55 29 08-0” instead of “0049895529080”.
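A short illustration of the effect, using the values from the example above:

```python
# Why zip codes and phone numbers belong in text fields:
# treating them as numbers silently drops leading zeros and formatting.
zip_as_number = int("01234")
print(zip_as_number)        # -> 1234, the leading zero is gone

zip_as_text = "01234"
print(zip_as_text)          # -> 01234, stored exactly as entered

phone_as_text = "++49 89 / 55 29 08-0"  # spacing and separators stay readable
print(phone_as_text)
```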
4. Define a consistent format for the display of date fields.
Example: “31.12.2015”, or with time: “31.12.2015 23:59:59”.
This way, the order of the individual date components is clearly defined.
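A small sketch of how such an agreement can be enforced in code, using the format from the example above:

```python
# Agree on one format string and use it for every date field.
from datetime import datetime

DATE_FORMAT = "%d.%m.%Y %H:%M:%S"

value = datetime(2015, 12, 31, 23, 59, 59)
as_text = value.strftime(DATE_FORMAT)            # -> "31.12.2015 23:59:59"
parsed = datetime.strptime(as_text, DATE_FORMAT)
print(as_text, parsed == value)                  # round-trips without ambiguity
```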
5. On a topical note: for internal use, choose a numeric value for the gender.
This keeps the list extendable instead of limiting it to the classic values “male”, “female” and “unknown” (see the recent development at Facebook).
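One possible way to model this; the concrete numeric codes below are an assumption for illustration, not a prescribed standard:

```python
# Store a numeric code internally and keep the labels in one extendable mapping.
GENDER_LABELS = {
    0: "unknown",
    1: "male",
    2: "female",
    # further values can be appended without touching existing records
}

record = {"last_name": "Muster", "gender": 2}
print(GENDER_LABELS.get(record["gender"], "unknown"))
```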
6. Define validation rules for external data sources in advance.
Examples of validations: check whether all mandatory fields are filled, e-mail addresses contain the @ symbol and end with a valid top-level domain, zip codes have 5 digits (Germany) or 4 digits (Austria and Switzerland), etc.
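A minimal sketch of these checks; the rules, field names and country codes are examples and should be adapted to your own import format:

```python
import re

MANDATORY_FIELDS = ["email", "last_name", "zip", "country"]
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[A-Za-z]{2,}$")  # "@" plus a plausible top-level domain
ZIP_LENGTHS = {"DE": 5, "AT": 4, "CH": 4}

def validate(record):
    errors = []
    for field in MANDATORY_FIELDS:
        if not record.get(field):
            errors.append(f"missing mandatory field: {field}")
    if record.get("email") and not EMAIL_PATTERN.match(record["email"]):
        errors.append("invalid e-mail address")
    expected = ZIP_LENGTHS.get(record.get("country", ""))
    if expected and len(record.get("zip", "")) != expected:
        errors.append(f"zip code must have {expected} digits")
    return errors

print(validate({"email": "anna@example.com", "last_name": "Muster",
                "zip": "80331", "country": "DE"}))   # -> []
```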
7. Define a separate data source ID for each data source.
The data source ID is a unique identifier for each data source. This way you can still trace later on how a record was added to the database.
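A small sketch of tagging imported records with a source ID; the IDs and source names are purely illustrative:

```python
# Every import run tags its records with the ID of the source,
# so the origin of a record can still be traced later.
DATA_SOURCES = {
    1: "website contact form",
    2: "trade fair lead scanner",
    3: "purchased address list",
}

def import_record(record, source_id):
    record["source_id"] = source_id      # stored with the record, never overwritten
    return record

lead = import_record({"email": "anna@example.com"}, source_id=2)
print(DATA_SOURCES[lead["source_id"]])   # -> "trade fair lead scanner"
```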
8. Add the fields “CREATION_DATE” and “CHANGE_DATE” to your data records.
Define that the field CREATION_DATE is filled with the current date when a new record is inserted, and that the field CHANGE_DATE is set to the current date each time a record is edited. This way you always know when a specific record was added or last changed.
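A minimal illustration of the rule; in a real database this is usually handled with column defaults and update triggers, the Python version only shows the principle:

```python
from datetime import datetime

def insert_record(data):
    # CREATION_DATE is set exactly once, when the record is created
    now = datetime.now()
    return {**data, "CREATION_DATE": now, "CHANGE_DATE": now}

def update_record(record, **changes):
    record.update(changes)
    record["CHANGE_DATE"] = datetime.now()   # creation date stays untouched
    return record

lead = insert_record({"email": "anna@example.com"})
lead = update_record(lead, email="a.muster@example.org")
print(lead["CREATION_DATE"] <= lead["CHANGE_DATE"])
```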
9. Standardize and automate manual data operations.
Standardize and automate data synchronizations and validations as far as possible. This way you eliminate human sources of error.
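As a rough sketch, such an automated import could chain the checks from the previous tips; the helper functions validate, import_record and insert_record are the ones sketched under tips 6 to 8 and are assumptions, not a fixed API:

```python
def automated_import(raw_records, source_id):
    imported, rejected = [], []
    for record in raw_records:
        errors = validate(record)                  # validation rules from tip 6
        if errors:
            # collected for review instead of being fixed by hand in the live data
            rejected.append((record, errors))
            continue
        record = import_record(record, source_id)  # data source ID from tip 7
        imported.append(insert_record(record))     # timestamps from tip 8
    return imported, rejected
```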
10. Validate the quality of your data periodically: count the number of duplicates and carry out a random visual inspection.
The latter occasionally reveals errors that automatic validations cannot catch, e.g. fake data or incorrect formats.
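A minimal sketch of both measures; the duplicate key (the e-mail address) and the sample size are examples:

```python
import random
from collections import Counter

def duplicate_count(records, key="email"):
    # normalize the key (trim, lower-case) so "Anna@Example.com" and "anna@example.com " count as one
    counts = Counter((r.get(key) or "").strip().lower() for r in records)
    return sum(n - 1 for n in counts.values() if n > 1)

def random_sample(records, size=20):
    # a small random sample for manual, visual inspection
    return random.sample(records, min(size, len(records)))

leads = [{"email": "anna@example.com"}, {"email": "Anna@example.com "}, {"email": "ben@example.org"}]
print(duplicate_count(leads))     # -> 1 duplicate
print(random_sample(leads, 2))    # two records for a manual look
```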