The cost of selecting a value from a database table is fairly high compared to the cost of having that value already in memory. It therefore seems preferable to use a smart caching mechanism that keeps frequently used values inside your application instead of retrieving them again and again from resources somewhere 'outside'.
Most frameworks ship with at least one cache implementation, and there are also several standalone cache implementations such as EHCache. Even ordinary HashMaps/Hashtables can serve as caches.
A critical factor when using caches in Java is the size of the cache: when your cache grows too big, the garbage collector has to clean up more often (which costs time), or your application may even crash with a java.lang.OutOfMemoryError.
One way to control the memory consumption of a cache is to store SoftReferences in a HashMap/Hashtable; another is to throw away old or unused content by implementing a caching strategy such as LRU (least recently used).
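As a sketch of the SoftReference approach: the garbage collector may clear a SoftReference at any time under memory pressure, so the cache must treat every lookup as potentially stale. The class name SoftCache and its methods below are invented for illustration.

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

public final class SoftCache {

    private final Map<String, SoftReference<String>> map = new HashMap<>();

    public void put(final String key, final String value) {
        map.put(key, new SoftReference<>(value));
    }

    public String get(final String key) {
        SoftReference<String> ref = map.get(key);
        if (ref == null) {
            return null;
        }
        String value = ref.get(); // null if the GC has already cleared the referent
        if (value == null) {
            map.remove(key); // drop the stale entry so the map does not fill up with dead references
        }
        return value;
    }
}
```

Note that the map entries themselves are not reclaimed automatically; only the referents are. That is why get() removes an entry whose referent has been cleared.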
A simple LRU cache already ships with the Java standard library: the LinkedHashMap. All you have to do is tell the map whether its eldest entry should be retained or removed after a new entry is inserted, by overriding the removeEldestEntry method. Additionally, the special three-argument constructor must be used to set the map's ordering mode: 'true' for access order (LRU), 'false' for insertion order.
Suppose we want to cache a mapping of String ids to String names with a maximum size of 20480 entries.
The example below shows how this can be done, using an anonymous inner class that overrides the removeEldestEntry method of LinkedHashMap.
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public final class SampleCache {

    private static final float LOAD_FACTOR = 0.75f;
    private static final int CACHE_MAX = 20480;
    private static final int CACHE_INIT = 10000;

    private static final LinkedHashMap<String, String> CACHE_MAP =
            new LinkedHashMap<String, String>(CACHE_INIT, LOAD_FACTOR, true) {

        private static final long serialVersionUID = 628057782352519437L;

        @Override
        protected boolean removeEldestEntry(final Map.Entry<String, String> eldest) {
            // Evict the least recently used entry once the cache exceeds its maximum size.
            return size() > CACHE_MAX;
        }
    };

    private SampleCache() {
        super();
    }

    public static void putId(final String id, final String name) {
        if (isEmpty(id) || isEmpty(name)) {
            return;
        }
        CACHE_MAP.put(id, name);
    }

    public static String getById(final String id) {
        return CACHE_MAP.get(id);
    }

    public static boolean isEmpty(final String field) {
        return field == null || field.isEmpty();
    }
}
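To see the eviction behaviour in action, here is a small self-contained demo. The class name LruDemo and the tiny limit of 3 entries are chosen only so the eviction is easy to observe; the mechanism is the same as in the cache above.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruDemo {

    public static void main(String[] args) {
        final int maxEntries = 3; // tiny limit so eviction is easy to observe
        Map<String, String> lru = new LinkedHashMap<String, String>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                return size() > maxEntries;
            }
        };
        lru.put("a", "1");
        lru.put("b", "2");
        lru.put("c", "3");
        lru.get("a");      // touch "a" so it becomes the most recently used entry
        lru.put("d", "4"); // evicts "b", now the least recently used entry
        System.out.println(lru.keySet()); // prints [c, a, d]
    }
}
```

Without the get("a") call, "a" would have been the eldest entry and would have been evicted instead of "b"; this is the difference the access-order flag makes.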
In a high-performance, multi-threaded environment there is still a problem: this cache is not thread-safe. How can we fix this small issue? We can wrap the map with Collections.synchronizedMap(), like so:
    private static final Map<String, String> CACHE_MAP = Collections.synchronizedMap(
            new LinkedHashMap<String, String>(CACHE_INIT, LOAD_FACTOR, true) {

        private static final long serialVersionUID = 628057782352519437L;

        @Override
        protected boolean removeEldestEntry(final Map.Entry<String, String> eldest) {
            return size() > CACHE_MAX;
        }
    });
Now your application contains a small, thread-safe LRU cache. Note that synchronizing even the read path matters here: in access-order mode, every get() reorders the internal linked list, so unsynchronized reads would still be unsafe.
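One caveat worth noting: Collections.synchronizedMap() synchronizes only the individual map methods; iterating over the map still requires manually holding the wrapper's monitor, as its documentation warns. A minimal sketch (the class name SyncIterationDemo is invented for illustration):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SyncIterationDemo {

    public static void main(String[] args) {
        Map<String, String> cache = Collections.synchronizedMap(
                new LinkedHashMap<String, String>(16, 0.75f, true));
        cache.put("a", "1");
        cache.put("b", "2");
        // Iteration is NOT covered by synchronizedMap's internal locking:
        // callers must synchronize on the wrapper themselves while iterating.
        synchronized (cache) {
            for (Map.Entry<String, String> e : cache.entrySet()) {
                System.out.println(e.getKey() + "=" + e.getValue());
            }
        }
    }
}
```

Forgetting this synchronized block can lead to a ConcurrentModificationException (or worse, silent corruption) when another thread modifies the cache during iteration.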
posted on 2012-08-07 16:11 by ゞ沉默是金ゞ