Commit 3ddd7ac6 authored by mhahnenberg@apple.com

Marking should be generational

https://bugs.webkit.org/show_bug.cgi?id=126552

Reviewed by Geoffrey Garen.

Source/JavaScriptCore: 

Re-marking the same objects over and over is a waste of effort. This patch implements 
the sticky mark bit algorithm (along with our already-present write barriers) to reduce 
overhead during garbage collection caused by rescanning objects.
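
The heart of the sticky mark bit scheme, as a minimal sketch (illustrative helper name;
the real logic lives in MarkedBlock::clearMarksWithCollectionType, listed below): mark
bits are only cleared when a FullCollection begins, so objects marked in an earlier
cycle stay "stuck" as old and are not re-traced by an EdenCollection.

    // Sketch only, not the patch's exact code.
    void MarkedBlock::clearMarksSketch(HeapOperation collectionType)
    {
        if (collectionType == FullCollection) {
            m_marks.clearAll(); // A full collection re-marks everything from scratch.
            return;
        }
        // EdenCollection: leave the mark bits alone. Already-marked (old) objects
        // read as live without being re-visited; only unmarked (new) objects and
        // remembered-set entries get scanned.
    }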

There are now two collection modes: EdenCollection and FullCollection. EdenCollections
only visit new objects or objects that were added to the remembered set by a write barrier.
FullCollections are normal collections that visit all objects regardless of their 
generation.
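
Concretely, the barrier that feeds the remembered set is the one this patch adds to
Heap.h (shown simplified here, with explanatory comments added and the
WRITE_BARRIER_PROFILING #ifs omitted; see the full hunk below):

    inline void Heap::writeBarrier(const JSCell* from, JSCell* to)
    {
        if (!from || !isMarked(from))
            return; // 'from' is itself new; the next EdenCollection visits it anyway.
        if (!to || isMarked(to))
            return; // No old-to-new edge was created, so nothing to remember.
        Heap::heap(from)->addToRememberedSet(from); // Re-scan 'from' next cycle.
    }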

In this patch EdenCollections do not do anything in CopiedSpace. This will be fixed in 
https://bugs.webkit.org/show_bug.cgi?id=126555.

* bytecode/CodeBlock.cpp:
(JSC::CodeBlock::visitAggregate):
* bytecode/CodeBlock.h:
(JSC::CodeBlockSet::mark):
* dfg/DFGOperations.cpp:
* heap/CodeBlockSet.cpp:
(JSC::CodeBlockSet::add):
(JSC::CodeBlockSet::traceMarked):
(JSC::CodeBlockSet::rememberCurrentlyExecutingCodeBlocks):
* heap/CodeBlockSet.h:
* heap/CopiedBlockInlines.h:
(JSC::CopiedBlock::reportLiveBytes):
* heap/CopiedSpace.cpp:
(JSC::CopiedSpace::didStartFullCollection):
* heap/CopiedSpace.h:
(JSC::CopiedSpace::heap):
* heap/Heap.cpp:
(JSC::Heap::Heap):
(JSC::Heap::didAbandon):
(JSC::Heap::markRoots):
(JSC::Heap::copyBackingStores):
(JSC::Heap::addToRememberedSet):
(JSC::Heap::collectAllGarbage):
(JSC::Heap::collect):
(JSC::Heap::didAllocate):
(JSC::Heap::writeBarrier):
* heap/Heap.h:
(JSC::Heap::isInRememberedSet):
(JSC::Heap::operationInProgress):
(JSC::Heap::shouldCollect):
(JSC::Heap::isCollecting):
(JSC::Heap::isWriteBarrierEnabled):
(JSC::Heap::writeBarrier):
* heap/HeapOperation.h:
* heap/MarkStack.cpp:
(JSC::MarkStackArray::~MarkStackArray):
(JSC::MarkStackArray::clear):
(JSC::MarkStackArray::fillVector):
* heap/MarkStack.h:
* heap/MarkedAllocator.cpp:
(JSC::isListPagedOut):
(JSC::MarkedAllocator::isPagedOut):
(JSC::MarkedAllocator::tryAllocateHelper):
(JSC::MarkedAllocator::addBlock):
(JSC::MarkedAllocator::removeBlock):
(JSC::MarkedAllocator::reset):
* heap/MarkedAllocator.h:
(JSC::MarkedAllocator::MarkedAllocator):
* heap/MarkedBlock.cpp:
(JSC::MarkedBlock::clearMarks):
(JSC::MarkedBlock::clearRememberedSet):
(JSC::MarkedBlock::clearMarksWithCollectionType):
(JSC::MarkedBlock::lastChanceToFinalize):
* heap/MarkedBlock.h: Changed atomSize to 16 bytes because we have no objects smaller
than 16 bytes. This is also to pay for the additional Bitmap for the remembered set
(see the arithmetic sketch after this file list).
(JSC::MarkedBlock::didConsumeEmptyFreeList):
(JSC::MarkedBlock::setRemembered):
(JSC::MarkedBlock::clearRemembered):
(JSC::MarkedBlock::atomicClearRemembered):
(JSC::MarkedBlock::isRemembered):
* heap/MarkedSpace.cpp:
(JSC::MarkedSpace::~MarkedSpace):
(JSC::MarkedSpace::resetAllocators):
(JSC::MarkedSpace::visitWeakSets):
(JSC::MarkedSpace::reapWeakSets):
(JSC::VerifyMarked::operator()):
(JSC::MarkedSpace::clearMarks):
* heap/MarkedSpace.h:
(JSC::ClearMarks::operator()):
(JSC::ClearRememberedSet::operator()):
(JSC::MarkedSpace::didAllocateInBlock):
(JSC::MarkedSpace::clearRememberedSet):
* heap/SlotVisitor.cpp:
(JSC::SlotVisitor::~SlotVisitor):
(JSC::SlotVisitor::clearMarkStack):
* heap/SlotVisitor.h:
(JSC::SlotVisitor::markStack):
(JSC::SlotVisitor::sharedData):
* heap/SlotVisitorInlines.h:
(JSC::SlotVisitor::internalAppend):
(JSC::SlotVisitor::unconditionallyAppend):
(JSC::SlotVisitor::copyLater):
(JSC::SlotVisitor::reportExtraMemoryUsage):
(JSC::SlotVisitor::heap):
* jit/Repatch.cpp:
* runtime/JSGenericTypedArrayViewInlines.h:
(JSC::JSGenericTypedArrayView<Adaptor>::visitChildren):
* runtime/JSPropertyNameIterator.h:
(JSC::StructureRareData::setEnumerationCache):
* runtime/JSString.cpp:
(JSC::JSString::visitChildren):
* runtime/StructureRareDataInlines.h:
(JSC::StructureRareData::setPreviousID):
(JSC::StructureRareData::setObjectToStringValue):
* runtime/WeakMapData.cpp:
(JSC::WeakMapData::visitChildren):
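
To make the MarkedBlock.h atomSize note above concrete, here is the bookkeeping
arithmetic (the 64KB block size is assumed for illustration, not quoted from the patch):

    // One Bitmap costs one bit per atom per block:
    //   atomsPerBlock  = blockSize / atomSize = 64 KB / 16 B = 4096 atoms
    //   bytesPerBitmap = atomsPerBlock / 8    = 512 bytes
    // Marks plus the new remembered-set Bitmap cost 1 KB per block at
    // atomSize 16; at half the atomSize they would cost 2 KB, so doubling
    // the atom pays for the extra Bitmap.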

Source/WTF: 

* wtf/Bitmap.h:
(WTF::WordType>::count): Added a cast that became necessary when Bitmap
is used with types smaller than int32_t.
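
For context, the shape of that fix (an illustrative sketch, not the actual Bitmap.h
hunk, which is not shown on this page): when the Bitmap's word type is narrower than
int32_t, each word is widened explicitly before being handed to the population-count
helper.

    // Sketch: counting set bits when WordType may be uint8_t or uint16_t.
    template<typename WordType>
    size_t countSetBits(const WordType* words, size_t numWords)
    {
        size_t result = 0;
        for (size_t i = 0; i < numWords; ++i)
            result += WTF::bitCount(static_cast<unsigned>(words[i])); // explicit widening cast
        return result;
    }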


git-svn-id: http://svn.webkit.org/repository/webkit/trunk@161615 268f45cc-cd09-0410-ab3c-d52691b4dbfc
parent e3271440
@@ -1954,15 +1954,15 @@ void CodeBlock::visitAggregate(SlotVisitor& visitor)
     if (CodeBlock* otherBlock = specialOSREntryBlockOrNull())
         otherBlock->visitAggregate(visitor);

-    visitor.reportExtraMemoryUsage(sizeof(CodeBlock));
+    visitor.reportExtraMemoryUsage(ownerExecutable(), sizeof(CodeBlock));
     if (m_jitCode)
-        visitor.reportExtraMemoryUsage(m_jitCode->size());
+        visitor.reportExtraMemoryUsage(ownerExecutable(), m_jitCode->size());
     if (m_instructions.size()) {
         // Divide by refCount() because m_instructions points to something that is shared
         // by multiple CodeBlocks, and we only want to count it towards the heap size once.
         // Having each CodeBlock report only its proportional share of the size is one way
         // of accomplishing this.
-        visitor.reportExtraMemoryUsage(m_instructions.size() * sizeof(Instruction) / m_instructions.refCount());
+        visitor.reportExtraMemoryUsage(ownerExecutable(), m_instructions.size() * sizeof(Instruction) / m_instructions.refCount());
     }

     visitor.append(&m_unlinkedCode);
@@ -1269,6 +1269,9 @@ inline void CodeBlockSet::mark(void* candidateCodeBlock)
         return;

     (*iter)->m_mayBeExecuting = true;
+#if ENABLE(GGC)
+    m_currentlyExecuting.append(static_cast<CodeBlock*>(candidateCodeBlock));
+#endif
 }

 } // namespace JSC
@@ -850,6 +850,7 @@ char* JIT_OPERATION operationReallocateButterflyToHavePropertyStorageWithInitial
     NativeCallFrameTracer tracer(&vm, exec);
     ASSERT(!object->structure()->outOfLineCapacity());
+    DeferGC deferGC(vm.heap);
     Butterfly* result = object->growOutOfLineStorage(vm, 0, initialOutOfLineCapacity);
     object->setButterflyWithoutChangingStructure(vm, result);
     return reinterpret_cast<char*>(result);
@@ -860,6 +861,7 @@ char* JIT_OPERATION operationReallocateButterflyToGrowPropertyStorage(ExecState*
     VM& vm = exec->vm();
     NativeCallFrameTracer tracer(&vm, exec);
+    DeferGC deferGC(vm.heap);
     Butterfly* result = object->growOutOfLineStorage(vm, object->structure()->outOfLineCapacity(), newSize);
     object->setButterflyWithoutChangingStructure(vm, result);
     return reinterpret_cast<char*>(result);
@@ -45,7 +45,8 @@ CodeBlockSet::~CodeBlockSet()
 void CodeBlockSet::add(PassRefPtr<CodeBlock> codeBlock)
 {
-    bool isNewEntry = m_set.add(codeBlock.leakRef()).isNewEntry;
+    CodeBlock* block = codeBlock.leakRef();
+    bool isNewEntry = m_set.add(block).isNewEntry;
     ASSERT_UNUSED(isNewEntry, isNewEntry);
 }
@@ -101,9 +102,20 @@ void CodeBlockSet::traceMarked(SlotVisitor& visitor)
         CodeBlock* codeBlock = *iter;
         if (!codeBlock->m_mayBeExecuting)
             continue;
-        codeBlock->visitAggregate(visitor);
+        codeBlock->ownerExecutable()->methodTable()->visitChildren(codeBlock->ownerExecutable(), visitor);
     }
 }

+void CodeBlockSet::rememberCurrentlyExecutingCodeBlocks(Heap* heap)
+{
+#if ENABLE(GGC)
+    for (size_t i = 0; i < m_currentlyExecuting.size(); ++i)
+        heap->addToRememberedSet(m_currentlyExecuting[i]->ownerExecutable());
+    m_currentlyExecuting.clear();
+#else
+    UNUSED_PARAM(heap);
+#endif // ENABLE(GGC)
+}
+
 } // namespace JSC
@@ -30,10 +30,12 @@
 #include <wtf/Noncopyable.h>
 #include <wtf/PassRefPtr.h>
 #include <wtf/RefPtr.h>
+#include <wtf/Vector.h>

 namespace JSC {

 class CodeBlock;
+class Heap;
 class SlotVisitor;

 // CodeBlockSet tracks all CodeBlocks. Every CodeBlock starts out with one
@@ -65,11 +67,16 @@ public:
     // mayBeExecuting.
     void traceMarked(SlotVisitor&);

+    // Add all currently executing CodeBlocks to the remembered set to be
+    // re-scanned during the next collection.
+    void rememberCurrentlyExecutingCodeBlocks(Heap*);
+
 private:
     // This is not a set of RefPtr<CodeBlock> because we need to be able to find
     // arbitrary bogus pointers. I could have written a thingy that had peek types
     // and all, but that seemed like overkill.
     HashSet<CodeBlock* > m_set;
+    Vector<CodeBlock*> m_currentlyExecuting;
 };

 } // namespace JSC
@@ -42,6 +42,9 @@ inline void CopiedBlock::reportLiveBytes(JSCell* owner, CopyToken token, unsigne
 #endif
     m_liveBytes += bytes;

+    if (isPinned())
+        return;
+
     if (!shouldEvacuate()) {
         pin();
         return;
@@ -316,4 +316,17 @@ bool CopiedSpace::isPagedOut(double deadline)
         || isBlockListPagedOut(deadline, &m_oversizeBlocks);
 }

+void CopiedSpace::didStartFullCollection()
+{
+    ASSERT(heap()->operationInProgress() == FullCollection);
+
+    ASSERT(m_fromSpace->isEmpty());
+
+    for (CopiedBlock* block = m_toSpace->head(); block; block = block->next())
+        block->didSurviveGC();
+
+    for (CopiedBlock* block = m_oversizeBlocks.head(); block; block = block->next())
+        block->didSurviveGC();
+}
+
 } // namespace JSC
@@ -60,6 +60,8 @@ public:
     CopiedAllocator& allocator() { return m_allocator; }

+    void didStartFullCollection();
+
     void startedCopying();
     void doneCopying();
     bool isInCopyPhase() { return m_inCopyingPhase; }
@@ -80,6 +82,8 @@ public:
     static CopiedBlock* blockFor(void*);

+    Heap* heap() const { return m_heap; }
+
 private:
     static bool isOversize(size_t);
@@ -253,9 +253,11 @@ Heap::Heap(VM* vm, HeapType heapType)
     , m_ramSize(ramSize())
     , m_minBytesPerCycle(minHeapSize(m_heapType, m_ramSize))
     , m_sizeAfterLastCollect(0)
-    , m_bytesAllocatedLimit(m_minBytesPerCycle)
-    , m_bytesAllocated(0)
-    , m_bytesAbandoned(0)
+    , m_bytesAllocatedThisCycle(0)
+    , m_bytesAbandonedThisCycle(0)
+    , m_maxEdenSize(m_minBytesPerCycle)
+    , m_maxHeapSize(m_minBytesPerCycle)
+    , m_shouldDoFullCollection(false)
     , m_totalBytesVisited(0)
     , m_totalBytesCopied(0)
     , m_operationInProgress(NoOperation)
@@ -269,7 +271,7 @@ Heap::Heap(VM* vm, HeapType heapType)
     , m_copyVisitor(m_sharedData)
     , m_handleSet(vm)
     , m_isSafeToCollect(false)
-    , m_writeBarrierBuffer(128)
+    , m_writeBarrierBuffer(256)
     , m_vm(vm)
     , m_lastGCLength(0)
     , m_lastCodeDiscardTime(WTF::monotonicallyIncreasingTime())
@@ -332,8 +334,8 @@ void Heap::reportAbandonedObjectGraph()
 void Heap::didAbandon(size_t bytes)
 {
     if (m_activityCallback)
-        m_activityCallback->didAllocate(m_bytesAllocated + m_bytesAbandoned);
-    m_bytesAbandoned += bytes;
+        m_activityCallback->didAllocate(m_bytesAllocatedThisCycle + m_bytesAbandonedThisCycle);
+    m_bytesAbandonedThisCycle += bytes;
 }

 void Heap::protect(JSValue k)
@@ -487,6 +489,9 @@ void Heap::markRoots()
     visitor.setup();
     HeapRootVisitor heapRootVisitor(visitor);

+    Vector<const JSCell*> rememberedSet(m_slotVisitor.markStack().size());
+    m_slotVisitor.markStack().fillVector(rememberedSet);
+
     {
         ParallelModeEnabler enabler(visitor);
@@ -590,6 +595,14 @@ void Heap::markRoots()
         }
     }

+    {
+        GCPHASE(ClearRememberedSet);
+        for (unsigned i = 0; i < rememberedSet.size(); ++i) {
+            const JSCell* cell = rememberedSet[i];
+            MarkedBlock::blockFor(cell)->clearRemembered(cell);
+        }
+    }
+
     GCCOUNTER(VisitedValueCount, visitor.visitCount());

     m_sharedData.didFinishMarking();
@@ -601,8 +614,14 @@ void Heap::markRoots()
     MARK_LOG_MESSAGE2("\nNumber of live Objects after full GC %lu, took %.6f secs\n", visitCount, WTF::monotonicallyIncreasingTime() - gcStartTime);
 #endif

-    m_totalBytesVisited = visitor.bytesVisited();
-    m_totalBytesCopied = visitor.bytesCopied();
+    if (m_operationInProgress == EdenCollection) {
+        m_totalBytesVisited += visitor.bytesVisited();
+        m_totalBytesCopied += visitor.bytesCopied();
+    } else {
+        ASSERT(m_operationInProgress == FullCollection);
+        m_totalBytesVisited = visitor.bytesVisited();
+        m_totalBytesCopied = visitor.bytesCopied();
+    }
 #if ENABLE(PARALLEL_GC)
     m_totalBytesVisited += m_sharedData.childBytesVisited();
     m_totalBytesCopied += m_sharedData.childBytesCopied();
@@ -615,8 +634,12 @@ void Heap::markRoots()
     m_sharedData.reset();
 }

+template <HeapOperation collectionType>
 void Heap::copyBackingStores()
 {
+    if (collectionType == EdenCollection)
+        return;
+
     m_storageSpace.startedCopying();
     if (m_storageSpace.shouldDoCopyPhase()) {
         m_sharedData.didStartCopying();
@@ -627,7 +650,7 @@ void Heap::copyBackingStores()
         // before signaling that the phase is complete.
         m_storageSpace.doneCopying();
         m_sharedData.didFinishCopying();
-    } else 
+    } else
         m_storageSpace.doneCopying();
 }
@@ -723,11 +746,22 @@ void Heap::deleteUnmarkedCompiledCode()
     m_jitStubRoutines.deleteUnmarkedJettisonedStubRoutines();
 }

+void Heap::addToRememberedSet(const JSCell* cell)
+{
+    ASSERT(cell);
+    ASSERT(!Options::enableConcurrentJIT() || !isCompilationThread());
+    if (isInRememberedSet(cell))
+        return;
+    MarkedBlock::blockFor(cell)->setRemembered(cell);
+    m_slotVisitor.unconditionallyAppend(const_cast<JSCell*>(cell));
+}
+
 void Heap::collectAllGarbage()
 {
     if (!m_isSafeToCollect)
         return;

+    m_shouldDoFullCollection = true;
     collect();

     SamplingRegion samplingRegion("Garbage Collection: Sweeping");
@@ -764,9 +798,28 @@ void Heap::collect()
         RecursiveAllocationScope scope(*this);
         m_vm->prepareToDiscardCode();
     }

-    m_operationInProgress = Collection;
-    m_extraMemoryUsage = 0;
+    bool isFullCollection = m_shouldDoFullCollection;
+    if (isFullCollection) {
+        m_operationInProgress = FullCollection;
+        m_slotVisitor.clearMarkStack();
+        m_shouldDoFullCollection = false;
+        if (Options::logGC())
+            dataLog("FullCollection, ");
+    } else {
+#if ENABLE(GGC)
+        m_operationInProgress = EdenCollection;
+        if (Options::logGC())
+            dataLog("EdenCollection, ");
+#else
+        m_operationInProgress = FullCollection;
+        m_slotVisitor.clearMarkStack();
+        if (Options::logGC())
+            dataLog("FullCollection, ");
+#endif
+    }
+    if (m_operationInProgress == FullCollection)
+        m_extraMemoryUsage = 0;

     if (m_activityCallback)
         m_activityCallback->willCollect();
@@ -780,6 +833,16 @@ void Heap::collect()
     {
         GCPHASE(StopAllocation);
         m_objectSpace.stopAllocating();
+        if (m_operationInProgress == FullCollection)
+            m_storageSpace.didStartFullCollection();
+    }
+
+    {
+        GCPHASE(FlushWriteBarrierBuffer);
+        if (m_operationInProgress == EdenCollection)
+            m_writeBarrierBuffer.flush(*this);
+        else
+            m_writeBarrierBuffer.reset();
     }

     markRoots();
@@ -796,13 +859,16 @@ void Heap::collect()
         m_arrayBuffers.sweep();
     }

-    {
+    if (m_operationInProgress == FullCollection) {
         m_blockSnapshot.resize(m_objectSpace.blocks().set().size());
         MarkedBlockSnapshotFunctor functor(m_blockSnapshot);
         m_objectSpace.forEachBlock(functor);
     }

-    copyBackingStores();
+    if (m_operationInProgress == FullCollection)
+        copyBackingStores<FullCollection>();
+    else
+        copyBackingStores<EdenCollection>();

     {
         GCPHASE(FinalizeUnconditionalFinalizers);
@@ -819,8 +885,15 @@ void Heap::collect()
         m_vm->clearSourceProviderCaches();
     }

-    m_sweeper->startSweeping(m_blockSnapshot);
-    m_bytesAbandoned = 0;
+    if (m_operationInProgress == FullCollection)
+        m_sweeper->startSweeping(m_blockSnapshot);
+
+    {
+        GCPHASE(AddCurrentlyExecutingCodeBlocksToRememberedSet);
+        m_codeBlocks.rememberCurrentlyExecutingCodeBlocks(this);
+    }
+
+    m_bytesAbandonedThisCycle = 0;

     {
         GCPHASE(ResetAllocators);
@@ -831,21 +904,32 @@ void Heap::collect()
     if (Options::gcMaxHeapSize() && currentHeapSize > Options::gcMaxHeapSize())
         HeapStatistics::exitWithFailure();

+    m_sizeAfterLastCollect = currentHeapSize;
+
+    if (m_operationInProgress == FullCollection) {
+        // To avoid pathological GC churn in very small and very large heaps, we set
+        // the new allocation limit based on the current size of the heap, with a
+        // fixed minimum.
+        m_maxHeapSize = max(minHeapSize(m_heapType, m_ramSize), proportionalHeapSize(currentHeapSize, m_ramSize));
+        m_maxEdenSize = m_maxHeapSize - currentHeapSize;
+    } else {
+        ASSERT(currentHeapSize >= m_sizeAfterLastCollect);
+        m_maxEdenSize = m_maxHeapSize - currentHeapSize;
+        double edenToOldGenerationRatio = (double)m_maxEdenSize / (double)m_maxHeapSize;
+        double minEdenToOldGenerationRatio = 1.0 / 3.0;
+        if (edenToOldGenerationRatio < minEdenToOldGenerationRatio)
+            m_shouldDoFullCollection = true;
+        m_maxHeapSize += currentHeapSize - m_sizeAfterLastCollect;
+        m_maxEdenSize = m_maxHeapSize - currentHeapSize;
+    }
+
-    // To avoid pathological GC churn in very small and very large heaps, we set
-    // the new allocation limit based on the current size of the heap, with a
-    // fixed minimum.
-    size_t maxHeapSize = max(minHeapSize(m_heapType, m_ramSize), proportionalHeapSize(currentHeapSize, m_ramSize));
-    m_bytesAllocatedLimit = maxHeapSize - currentHeapSize;
-
-    m_sizeAfterLastCollect = currentHeapSize;
-
-    m_bytesAllocated = 0;
+    m_bytesAllocatedThisCycle = 0;

     double lastGCEndTime = WTF::monotonicallyIncreasingTime();
     m_lastGCLength = lastGCEndTime - lastGCStartTime;
     if (Options::recordGCPauseTimes())
         HeapStatistics::recordGCPauseTime(lastGCStartTime, lastGCEndTime);
-    RELEASE_ASSERT(m_operationInProgress == Collection);
+    RELEASE_ASSERT(m_operationInProgress == EdenCollection || m_operationInProgress == FullCollection);

     m_operationInProgress = NoOperation;
     JAVASCRIPTCORE_GC_END();
@@ -863,10 +947,6 @@ void Heap::collect()
         double after = currentTimeMS();
         dataLog(after - before, " ms, ", currentHeapSize / 1024, " kb]\n");
     }
-
-#if ENABLE(ALLOCATION_LOGGING)
-    dataLogF("JSC GC finishing collection.\n");
-#endif
 }

 bool Heap::collectIfNecessaryOrDefer()
@@ -916,8 +996,8 @@ void Heap::setGarbageCollectionTimerEnabled(bool enable)
 void Heap::didAllocate(size_t bytes)
 {
     if (m_activityCallback)
-        m_activityCallback->didAllocate(m_bytesAllocated + m_bytesAbandoned);
-    m_bytesAllocated += bytes;
+        m_activityCallback->didAllocate(m_bytesAllocatedThisCycle + m_bytesAbandonedThisCycle);
+    m_bytesAllocatedThisCycle += bytes;
 }

 bool Heap::isValidAllocation(size_t)
@@ -994,6 +1074,15 @@ void Heap::decrementDeferralDepthAndGCIfNeeded()
     collectIfNecessaryOrDefer();
 }

+void Heap::writeBarrier(const JSCell* from)
+{
+    ASSERT_GC_OBJECT_LOOKS_VALID(const_cast<JSCell*>(from));
+    if (!from || !isMarked(from))
+        return;
+    Heap* heap = Heap::heap(from);
+    heap->addToRememberedSet(from);
+}
+
 void Heap::flushWriteBarrierBuffer(JSCell* cell)
 {
 #if ENABLE(GGC)
@@ -94,11 +94,17 @@ namespace JSC {
         static bool testAndSetMarked(const void*);
         static void setMarked(const void*);

+        JS_EXPORT_PRIVATE void addToRememberedSet(const JSCell*);
+        bool isInRememberedSet(const JSCell* cell) const
+        {
+            ASSERT(cell);
+            ASSERT(!Options::enableConcurrentJIT() || !isCompilationThread());
+            return MarkedBlock::blockFor(cell)->isRemembered(cell);
+        }
         static bool isWriteBarrierEnabled();
-        static void writeBarrier(const JSCell*);
+        JS_EXPORT_PRIVATE static void writeBarrier(const JSCell*);
         static void writeBarrier(const JSCell*, JSValue);
         static void writeBarrier(const JSCell*, JSCell*);
-        static uint8_t* addressOfCardFor(JSCell*);

         WriteBarrierBuffer& writeBarrierBuffer() { return m_writeBarrierBuffer; }
         void flushWriteBarrierBuffer(JSCell*);
@@ -120,6 +126,7 @@ namespace JSC {
         // true if collection is in progress
         inline bool isCollecting();
+        inline HeapOperation operationInProgress() { return m_operationInProgress; }
         // true if an allocation or collection is in progress
         inline bool isBusy();
@@ -236,6 +243,7 @@ namespace JSC {
         void markRoots();
         void markProtectedObjects(HeapRootVisitor&);
         void markTempSortVectors(HeapRootVisitor&);
+        template <HeapOperation collectionType>
         void copyBackingStores();
         void harvestWeakReferences();
         void finalizeUnconditionalFinalizers();
@@ -257,10 +265,11 @@ namespace JSC {
         const size_t m_minBytesPerCycle;
         size_t m_sizeAfterLastCollect;

-        size_t m_bytesAllocatedLimit;
-        size_t m_bytesAllocated;
-        size_t m_bytesAbandoned;
+        size_t m_bytesAllocatedThisCycle;
+        size_t m_bytesAbandonedThisCycle;
+        size_t m_maxEdenSize;
+        size_t m_maxHeapSize;
+        bool m_shouldDoFullCollection;
         size_t m_totalBytesVisited;
         size_t m_totalBytesCopied;
@@ -271,6 +280,8 @@ namespace JSC {
         GCIncomingRefCountedSet<ArrayBuffer> m_arrayBuffers;
         size_t m_extraMemoryUsage;

+        HashSet<const JSCell*> m_copyingRememberedSet;
+
         ProtectCountSet m_protectedValues;
         Vector<Vector<ValueStringPair, 0, UnsafeVectorOverflow>* > m_tempSortingVectors;
         OwnPtr<HashSet<MarkedArgumentBuffer*>> m_markListSet;
@@ -322,8 +333,8 @@ namespace JSC {
         if (isDeferred())
             return false;
         if (Options::gcMaxHeapSize())
-            return m_bytesAllocated > Options::gcMaxHeapSize() && m_isSafeToCollect && m_operationInProgress == NoOperation;
-        return m_bytesAllocated > m_bytesAllocatedLimit && m_isSafeToCollect && m_operationInProgress == NoOperation;
+            return m_bytesAllocatedThisCycle > Options::gcMaxHeapSize() && m_isSafeToCollect && m_operationInProgress == NoOperation;
+        return m_bytesAllocatedThisCycle > m_maxEdenSize && m_isSafeToCollect && m_operationInProgress == NoOperation;
     }

     bool Heap::isBusy()
@@ -333,7 +344,7 @@ namespace JSC {
     bool Heap::isCollecting()
     {
-        return m_operationInProgress == Collection;
+        return m_operationInProgress == FullCollection || m_operationInProgress == EdenCollection;
     }

     inline Heap* Heap::heap(const JSCell* cell)
@@ -370,26 +381,33 @@ namespace JSC {
     inline bool Heap::isWriteBarrierEnabled()
     {
-#if ENABLE(WRITE_BARRIER_PROFILING)
+#if ENABLE(WRITE_BARRIER_PROFILING) || ENABLE(GGC)
         return true;
 #else
         return false;
 #endif
     }

-    inline void Heap::writeBarrier(const JSCell*)
-    {
-        WriteBarrierCounters::countWriteBarrier();
-    }
-
-    inline void Heap::writeBarrier(const JSCell*, JSCell*)
+    inline void Heap::writeBarrier(const JSCell* from, JSCell* to)
     {
+#if ENABLE(WRITE_BARRIER_PROFILING)
         WriteBarrierCounters::countWriteBarrier();
+#endif
+        if (!from || !isMarked(from))
+            return;
+        if (!to || isMarked(to))
+            return;
+        Heap::heap(from)->addToRememberedSet(from);
     }

-    inline void Heap::writeBarrier(const JSCell*, JSValue)
+    inline void Heap::writeBarrier(const JSCell* from, JSValue to)
     {
+#if ENABLE(WRITE_BARRIER_PROFILING)
         WriteBarrierCounters::countWriteBarrier();
+#endif
+        if (!to.isCell())
+            return;
+        writeBarrier(from, to.asCell());