Commit 39281e26 authored by ggaren@apple.com

Made WeakSet per-block instead of per-heap

https://bugs.webkit.org/show_bug.cgi?id=87401

Reviewed by Oliver Hunt.

This gives us fast access to the set of all weak pointers for a block,
which is a step toward lazy finalization.
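
A note for readers: blocks in the marked space are allocated on
block-size boundaries, so the owning block, and now its weak set, is
reachable from any cell pointer with a single mask, which is what
MarkedBlock::blockFor does in the diff below. A minimal compilable
sketch of the idea (the 64 KB block size here is an assumed value for
illustration, not the real JSC constant):

    #include <cstdint>

    struct WeakSet { };

    struct MarkedBlock {
        static const uintptr_t blockSize = 64 * 1024;        // assumed for illustration
        static const uintptr_t blockMask = ~(blockSize - 1); // valid because blocks are size-aligned

        // Mask off the low bits of any interior pointer to find its block:
        // O(1), with no per-heap side table.
        static MarkedBlock* blockFor(const void* p)
        {
            return reinterpret_cast<MarkedBlock*>(reinterpret_cast<uintptr_t>(p) & blockMask);
        }

        WeakSet& weakSet() { return m_weakSet; }

    private:
        WeakSet m_weakSet; // one weak set per block, as in this patch
    };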

No performance change.

* heap/Heap.cpp:
(JSC::Heap::Heap):
(JSC::Heap::lastChanceToFinalize): Removed the per-heap weak set, since
it's per-block now.

(JSC::Heap::markRoots): Delegate weak set visiting to the marked space,
since it knows how to iterate all blocks.

(JSC::Heap::collect): Moved the reaping outside of markRoots, since it
doesn't mark anything.

Make sure to reset allocators after shrinking, since shrinking may
deallocate the current allocator.
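
To illustrate the ordering constraint, here is a hypothetical sketch of
the tail of collect(), not the real code; see the Heap.cpp hunks below
for the actual phases. The MarkedSpace stand-in has empty bodies so the
sketch is self-contained:

    // Stand-in for JSC::MarkedSpace; bodies elided for the sketch.
    struct MarkedSpace {
        void sweep() { }
        void shrink() { }
        void resetAllocators() { }
    };

    // If resetAllocators() ran before shrink(), an allocator could be left
    // pointing into a block that shrink() just deallocated.
    void collectTailSketch(MarkedSpace& objectSpace)
    {
        objectSpace.sweep();           // sweeping may leave some blocks fully empty
        objectSpace.shrink();          // frees empty blocks, possibly one an
                                       // allocator is still positioned on
        objectSpace.resetAllocators(); // therefore reset only after shrinking
    }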

* heap/Heap.h:
(Heap): No more per-heap weak set, since it's per-block now.

* heap/MarkedBlock.cpp:
(JSC::MarkedBlock::MarkedBlock):
* heap/MarkedBlock.h:
(MarkedBlock):
(JSC::MarkedBlock::lastChanceToFinalize): Migrated finalization logic
here from the heap, so the heap doesn't need to know about our internal
data structures like our weak set.

(JSC::MarkedBlock::heap):
(JSC::MarkedBlock::weakSet):
(JSC::MarkedBlock::shrink):
(JSC::MarkedBlock::resetAllocator):
(JSC::MarkedBlock::visitWeakSet):
(JSC::MarkedBlock::reapWeakSet):
(JSC::MarkedBlock::sweepWeakSet):
* heap/MarkedSpace.cpp:
(JSC::VisitWeakSet::VisitWeakSet):
(JSC::VisitWeakSet::operator()):
(VisitWeakSet):
(JSC):
(JSC::ReapWeakSet::operator()):
(JSC::SweepWeakSet::operator()):
(JSC::LastChanceToFinalize::operator()):
(JSC::MarkedSpace::lastChanceToFinalize):
(JSC::ResetAllocator::operator()):
(JSC::MarkedSpace::resetAllocators):
(JSC::MarkedSpace::visitWeakSets):
(JSC::MarkedSpace::reapWeakSets):
(JSC::MarkedSpace::sweepWeakSets):
(JSC::Shrink::operator()):
(JSC::MarkedSpace::shrink):
* heap/MarkedSpace.h:
(MarkedSpace): Make sure to account for our weak sets when sweeping,
shrinking, etc.
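
These per-block operations are applied with small functor structs handed
to MarkedSpace::forEachBlock, as the MarkedSpace.cpp hunks below show. A
hedged sketch of the pattern with stand-in types (the real forEachBlock
iterates the space's block set, and its exact signature may differ):

    #include <vector>

    struct MarkedBlock { void reapWeakSet() { /* per-block reaping */ } };

    // Functor object: one direct call per block, no virtual dispatch.
    struct ReapWeakSet {
        void operator()(MarkedBlock* block) { block->reapWeakSet(); }
    };

    template<typename Functor>
    void forEachBlock(std::vector<MarkedBlock*>& blocks, Functor& functor)
    {
        for (size_t i = 0; i < blocks.size(); ++i)
            functor(blocks[i]);
    }

    // Usage mirroring MarkedSpace::reapWeakSets():
    //     ReapWeakSet reap;
    //     forEachBlock(m_blocks, reap);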

* heap/WeakSet.cpp:
(JSC):
* heap/WeakSet.h:
(WeakSet):
(JSC::WeakSet::heap):
(JSC):
(JSC::WeakSet::lastChanceToFinalize):
(JSC::WeakSet::visit):
(JSC::WeakSet::reap):
(JSC::WeakSet::shrink):
(JSC::WeakSet::resetAllocator): Inlined some things since they're called
once per block now instead of once per heap.

* heap/WeakSetInlines.h:
(JSC::WeakSet::allocate): Use the per-block weak set since there is no
per-heap weak set anymore.
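
The allocate() fast path itself is unchanged; only the lookup moved. A
simplified stand-in for the free-list pop visible in the WeakSetInlines.h
hunk below (FreeCell and findAllocator are reduced stand-ins, not the
real JSC definitions):

    struct FreeCell { FreeCell* next; };

    struct WeakSetSketch {
        FreeCell* m_allocator;

        WeakSetSketch() : m_allocator(0) { }

        FreeCell* allocate()
        {
            FreeCell* allocator = m_allocator;
            if (!allocator)                  // UNLIKELY(...) in the real code
                allocator = findAllocator(); // slow path: find a block with free cells
            m_allocator = allocator->next;   // pop one cell off the free list
            return allocator;
        }

    private:
        FreeCell* findAllocator()
        {
            // The real code searches or allocates a WeakBlock; this stand-in
            // hands out one static cell so the sketch is self-contained.
            static FreeCell cell = { 0 };
            return &cell;
        }
    };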


git-svn-id: http://svn.webkit.org/repository/webkit/trunk@118416 268f45cc-cd09-0410-ab3c-d52691b4dbfc
parent 8ca6a7aa
heap/Heap.cpp
@@ -248,7 +248,6 @@ Heap::Heap(JSGlobalData* globalData, HeapType heapType)
     , m_machineThreads(this)
     , m_sharedData(globalData)
     , m_slotVisitor(m_sharedData)
-    , m_weakSet(this)
     , m_handleSet(globalData)
     , m_isSafeToCollect(false)
     , m_globalData(globalData)
@@ -278,7 +277,6 @@ void Heap::lastChanceToFinalize()
     if (size_t size = m_protectedValues.size())
         WTFLogAlways("ERROR: JavaScriptCore heap deallocated while %ld values were still protected", static_cast<unsigned long>(size));
 
-    m_weakSet.lastChanceToFinalize();
     m_objectSpace.lastChanceToFinalize();
 
 #if ENABLE(SIMPLE_HEAP_PROFILING)
@@ -561,7 +559,7 @@ void Heap::markRoots(bool fullGC)
     {
         GCPHASE(VisitingLiveWeakHandles);
         while (true) {
-            m_weakSet.visit(heapRootVisitor);
+            m_objectSpace.visitWeakSets(heapRootVisitor);
             harvestWeakReferences();
             if (visitor.isEmpty())
                 break;
@@ -575,11 +573,6 @@
         }
     }
 
-    {
-        GCPHASE(ReapingWeakHandles);
-        m_weakSet.reap();
-    }
-
     GCCOUNTER(VisitedValueCount, visitor.visitCount());
 
     visitor.doneCopying();
@@ -682,25 +675,24 @@ void Heap::collect(SweepToggle sweepToggle)
     markRoots(fullGC);
 
+    {
+        GCPHASE(ReapingWeakHandles);
+        m_objectSpace.reapWeakSets();
+    }
+
     {
         GCPHASE(FinalizeUnconditionalFinalizers);
         finalizeUnconditionalFinalizers();
     }
 
     {
         GCPHASE(FinalizeWeakHandles);
-        m_weakSet.sweep();
+        m_objectSpace.sweepWeakSets();
         m_globalData->smallStrings.finalizeSmallStrings();
     }
 
     JAVASCRIPTCORE_GC_MARKED();
 
-    {
-        GCPHASE(ResetAllocators);
-        m_objectSpace.resetAllocators();
-        m_weakSet.resetAllocator();
-    }
-
     {
         GCPHASE(DeleteCodeBlocks);
         m_dfgCodeBlocks.deleteUnmarkedJettisonedCodeBlocks();
@@ -711,10 +703,14 @@ void Heap::collect(SweepToggle sweepToggle)
         GCPHASE(Sweeping);
         m_objectSpace.sweep();
         m_objectSpace.shrink();
-        m_weakSet.shrink();
         m_bytesAbandoned = 0;
     }
 
+    {
+        GCPHASE(ResetAllocators);
+        m_objectSpace.resetAllocators();
+    }
+
     size_t currentHeapSize = size();
     if (fullGC) {
         m_sizeAfterLastCollect = currentHeapSize;
heap/Heap.h
@@ -32,7 +32,6 @@
 #include "MarkedSpace.h"
 #include "SlotVisitor.h"
 #include "WeakHandleOwner.h"
-#include "WeakSet.h"
 #include "WriteBarrierSupport.h"
 #include <wtf/HashCountedSet.h>
 #include <wtf/HashSet.h>
@@ -148,7 +147,6 @@ namespace JSC {
         template<typename Functor> typename Functor::ReturnType forEachProtectedCell(Functor&);
         template<typename Functor> typename Functor::ReturnType forEachProtectedCell();
 
-        WeakSet* weakSet() { return &m_weakSet; }
         HandleSet* handleSet() { return &m_handleSet; }
         HandleStack* handleStack() { return &m_handleStack; }
@@ -225,7 +223,6 @@ namespace JSC {
         MarkStackThreadSharedData m_sharedData;
         SlotVisitor m_slotVisitor;
-        WeakSet m_weakSet;
         HandleSet m_handleSet;
         HandleStack m_handleStack;
         DFGCodeBlocks m_dfgCodeBlocks;
heap/MarkedBlock.cpp
@@ -52,7 +52,7 @@ MarkedBlock::MarkedBlock(const PageAllocationAligned& allocation, Heap* heap, si
     , m_endAtom(atomsPerBlock - m_atomsPerCell + 1)
     , m_cellsNeedDestruction(cellsNeedDestruction)
     , m_state(New) // All cells start out unmarked.
-    , m_heap(heap)
+    , m_weakSet(heap)
 {
     ASSERT(heap);
     HEAP_LOG_BLOCK_STATE_TRANSITION(this);
heap/MarkedBlock.h
@@ -25,6 +25,7 @@
 #include "CardSet.h"
 #include "HeapBlock.h"
+#include "WeakSet.h"
 #include <wtf/Bitmap.h>
 #include <wtf/DataLog.h>
 #include <wtf/DoublyLinkedList.h>
@@ -119,13 +120,21 @@ namespace JSC {
         static MarkedBlock* blockFor(const void*);
         static size_t firstAtom();
 
+        void lastChanceToFinalize();
         Heap* heap() const;
+        WeakSet& weakSet();
 
         void* allocate();
 
         enum SweepMode { SweepOnly, SweepToFreeList };
         FreeList sweep(SweepMode = SweepOnly);
+        void shrink();
+        void resetAllocator();
+
+        void visitWeakSet(HeapRootVisitor&);
+        void reapWeakSet();
+        void sweepWeakSet();
 
         // While allocating from a free list, MarkedBlock temporarily has bogus
         // cell liveness data. To restore accurate cell liveness data, call one
         // of these functions:
@@ -205,7 +214,7 @@
 #endif
         bool m_cellsNeedDestruction;
         BlockState m_state;
-        Heap* m_heap;
+        WeakSet m_weakSet;
     };
 
     inline MarkedBlock::FreeList::FreeList()
@@ -240,9 +249,47 @@ namespace JSC {
         return reinterpret_cast<MarkedBlock*>(reinterpret_cast<Bits>(p) & blockMask);
     }
 
+    inline void MarkedBlock::lastChanceToFinalize()
+    {
+        m_weakSet.lastChanceToFinalize();
+        clearMarks();
+        sweep();
+    }
+
     inline Heap* MarkedBlock::heap() const
     {
-        return m_heap;
+        return m_weakSet.heap();
     }
 
+    inline WeakSet& MarkedBlock::weakSet()
+    {
+        return m_weakSet;
+    }
+
+    inline void MarkedBlock::shrink()
+    {
+        m_weakSet.shrink();
+    }
+
+    inline void MarkedBlock::resetAllocator()
+    {
+        m_weakSet.resetAllocator();
+    }
+
+    inline void MarkedBlock::visitWeakSet(HeapRootVisitor& heapRootVisitor)
+    {
+        m_weakSet.visit(heapRootVisitor);
+    }
+
+    inline void MarkedBlock::reapWeakSet()
+    {
+        m_weakSet.reap();
+    }
+
+    inline void MarkedBlock::sweepWeakSet()
+    {
+        m_weakSet.sweep();
+    }
+
     inline void MarkedBlock::didConsumeFreeList()
@@ -272,7 +319,7 @@ namespace JSC {
     inline bool MarkedBlock::isEmpty()
     {
-        return m_marks.isEmpty();
+        return m_marks.isEmpty() && m_weakSet.isEmpty();
     }
 
     inline size_t MarkedBlock::cellSize()
heap/MarkedSpace.cpp
@@ -66,6 +66,21 @@ inline Take::ReturnType Take::returnValue()
     return m_blocks.head();
 }
 
+struct VisitWeakSet : MarkedBlock::VoidFunctor {
+    VisitWeakSet(HeapRootVisitor& heapRootVisitor) : m_heapRootVisitor(heapRootVisitor) { }
+    void operator()(MarkedBlock* block) { block->visitWeakSet(m_heapRootVisitor); }
+
+private:
+    HeapRootVisitor& m_heapRootVisitor;
+};
+
+struct ReapWeakSet : MarkedBlock::VoidFunctor {
+    void operator()(MarkedBlock* block) { block->reapWeakSet(); }
+};
+
+struct SweepWeakSet : MarkedBlock::VoidFunctor {
+    void operator()(MarkedBlock* block) { block->sweepWeakSet(); }
+};
+
 MarkedSpace::MarkedSpace(Heap* heap)
     : m_heap(heap)
 {
@@ -87,13 +102,20 @@ MarkedSpace::~MarkedSpace()
     freeBlocks(forEachBlock(take));
 }
 
+struct LastChanceToFinalize : MarkedBlock::VoidFunctor {
+    void operator()(MarkedBlock* block) { block->lastChanceToFinalize(); }
+};
+
 void MarkedSpace::lastChanceToFinalize()
 {
     canonicalizeCellLivenessData();
-    clearMarks();
-    sweep();
+    forEachBlock<LastChanceToFinalize>();
 }
 
+struct ResetAllocator : MarkedBlock::VoidFunctor {
+    void operator()(MarkedBlock* block) { block->resetAllocator(); }
+};
+
 void MarkedSpace::resetAllocators()
 {
     for (size_t cellSize = preciseStep; cellSize <= preciseCutoff; cellSize += preciseStep) {
@@ -105,6 +127,24 @@ void MarkedSpace::resetAllocators()
         allocatorFor(cellSize).reset();
         destructorAllocatorFor(cellSize).reset();
     }
+
+    forEachBlock<ResetAllocator>();
 }
 
+void MarkedSpace::visitWeakSets(HeapRootVisitor& heapRootVisitor)
+{
+    VisitWeakSet visitWeakSet(heapRootVisitor);
+    forEachBlock(visitWeakSet);
+}
+
+void MarkedSpace::reapWeakSets()
+{
+    forEachBlock<ReapWeakSet>();
+}
+
+void MarkedSpace::sweepWeakSets()
+{
+    forEachBlock<SweepWeakSet>();
+}
+
 void MarkedSpace::canonicalizeCellLivenessData()
@@ -148,11 +188,17 @@ void MarkedSpace::freeBlocks(MarkedBlock* head)
     }
 }
 
+struct Shrink : MarkedBlock::VoidFunctor {
+    void operator()(MarkedBlock* block) { block->shrink(); }
+};
+
 void MarkedSpace::shrink()
 {
     // We record a temporary list of empties to avoid modifying m_blocks while iterating it.
     Take takeIfEmpty(Take::TakeIfEmpty, this);
     freeBlocks(forEachBlock(takeIfEmpty));
+
+    forEachBlock<Shrink>();
 }
 
 #if ENABLE(GGC)
heap/MarkedSpace.h
@@ -83,6 +83,10 @@ public:
     void resetAllocators();
 
+    void visitWeakSets(HeapRootVisitor&);
+    void reapWeakSets();
+    void sweepWeakSets();
+
     MarkedBlockSet& blocks() { return m_blocks; }
     void canonicalizeCellLivenessData();
heap/WeakSet.cpp
@@ -40,24 +40,6 @@ WeakSet::~WeakSet()
     m_blocks.clear();
 }
 
-void WeakSet::lastChanceToFinalize()
-{
-    for (WeakBlock* block = m_blocks.head(); block; block = block->next())
-        block->lastChanceToFinalize();
-}
-
-void WeakSet::visit(HeapRootVisitor& visitor)
-{
-    for (WeakBlock* block = m_blocks.head(); block; block = block->next())
-        block->visit(visitor);
-}
-
-void WeakSet::reap()
-{
-    for (WeakBlock* block = m_blocks.head(); block; block = block->next())
-        block->reap();
-}
-
 void WeakSet::sweep()
 {
     WeakBlock* next;
@@ -73,23 +55,6 @@ void WeakSet::sweep()
     }
 }
 
-void WeakSet::shrink()
-{
-    WeakBlock* next;
-    for (WeakBlock* block = m_blocks.head(); block; block = next) {
-        next = block->next();
-        if (block->isEmpty())
-            removeAllocator(block);
-    }
-}
-
-void WeakSet::resetAllocator()
-{
-    m_allocator = 0;
-    m_nextAllocator = m_blocks.head();
-}
-
 WeakBlock::FreeCell* WeakSet::findAllocator()
 {
     if (WeakBlock::FreeCell* allocator = tryFindAllocator())
heap/WeakSet.h
@@ -42,13 +42,15 @@ public:
     ~WeakSet();
 
     void lastChanceToFinalize();
+    Heap* heap() const;
+    bool isEmpty() const;
 
     void visit(HeapRootVisitor&);
     void reap();
     void sweep();
-    void resetAllocator();
     void shrink();
+    void resetAllocator();
 
 private:
     JS_EXPORT_PRIVATE WeakBlock::FreeCell* findAllocator();
@@ -69,11 +71,61 @@ inline WeakSet::WeakSet(Heap* heap)
 {
 }
 
+inline Heap* WeakSet::heap() const
+{
+    return m_heap;
+}
+
+inline bool WeakSet::isEmpty() const
+{
+    for (WeakBlock* block = m_blocks.head(); block; block = block->next()) {
+        if (!block->isEmpty())
+            return false;
+    }
+
+    return true;
+}
+
 inline void WeakSet::deallocate(WeakImpl* weakImpl)
 {
     weakImpl->setState(WeakImpl::Deallocated);
 }
 
+inline void WeakSet::lastChanceToFinalize()
+{
+    for (WeakBlock* block = m_blocks.head(); block; block = block->next())
+        block->lastChanceToFinalize();
+}
+
+inline void WeakSet::visit(HeapRootVisitor& visitor)
+{
+    for (WeakBlock* block = m_blocks.head(); block; block = block->next())
+        block->visit(visitor);
+}
+
+inline void WeakSet::reap()
+{
+    for (WeakBlock* block = m_blocks.head(); block; block = block->next())
+        block->reap();
+}
+
+inline void WeakSet::shrink()
+{
+    WeakBlock* next;
+    for (WeakBlock* block = m_blocks.head(); block; block = next) {
+        next = block->next();
+        if (block->isEmpty())
+            removeAllocator(block);
+    }
+}
+
+inline void WeakSet::resetAllocator()
+{
+    m_allocator = 0;
+    m_nextAllocator = m_blocks.head();
+}
+
 } // namespace JSC
 
 #endif // WeakSet_h
heap/WeakSetInlines.h
@@ -32,7 +32,7 @@ namespace JSC {
 inline WeakImpl* WeakSet::allocate(JSValue jsValue, WeakHandleOwner* weakHandleOwner, void* context)
 {
-    WeakSet& weakSet = *Heap::heap(jsValue.asCell())->weakSet();
+    WeakSet& weakSet = MarkedBlock::blockFor(jsValue.asCell())->weakSet();
     WeakBlock::FreeCell* allocator = weakSet.m_allocator;
     if (UNLIKELY(!allocator))
         allocator = weakSet.findAllocator();