Test case fails after converting code from Objective-C to Swift

I am porting bitwise operations that were originally written in Objective-C / C over to Swift. I use UnsafeMutablePointer to hold the start of an allocated memory region and UnsafeMutableBufferPointer to access the elements within it.

You can get the source Objective-C file here.

    public init(size: Int) {
        self.size = size
        self.bitsLength = (size + 31) / 32
        self.startIdx = UnsafeMutablePointer<Int32>.alloc(bitsLength * sizeof(Int32))
        self.bits = UnsafeMutableBufferPointer(start: startIdx, count: bitsLength)
    }

    /**
     * @param from first bit to check
     * @return index of first bit that is set, starting from the given index, or size if none are set
     *         at or beyond its given index
     */
    public func nextSet(from: Int) -> Int {
        if from >= size { return size }
        var bitsOffset = from / 32
        var currentBits: Int32 = bits[bitsOffset]
        currentBits &= ~((1 << (from & 0x1F)) - 1).to32
        while currentBits == 0 {
            if ++bitsOffset == bitsLength {
                return size
            }
            currentBits = bits[bitsOffset]
        }
        let result: Int = bitsOffset * 32 + numberOfTrailingZeros(currentBits).toInt
        return result > size ? size : result
    }

    func numberOfTrailingZeros(i: Int32) -> Int {
        var i = i
        guard i != 0 else { return 32 }
        var n = 31
        var y: Int32
        y = i << 16
        if y != 0 { n = n - 16; i = y }
        y = i << 8
        if y != 0 { n = n - 8; i = y }
        y = i << 4
        if y != 0 { n = n - 4; i = y }
        y = i << 2
        if y != 0 { n = n - 2; i = y }
        return n - Int((UInt((i << 1)) >> 31))
    }

TestCase:

    func testGetNextSet1() {
        // Passed
        var bits = BitArray(size: 32)
        for i in 0..<bits.size {
            XCTAssertEqual(32, bits.nextSet(i), "\(i)")
        }

        // Failed
        bits = BitArray(size: 34)
        for i in 0..<bits.size {
            XCTAssertEqual(34, bits.nextSet(i), "\(i)")
        }
    }

Can someone explain why the second test fails, while the Objective-C version passes?

Edit: As @vacawama pointed out, if you break testGetNextSet into two separate tests, both pass.

Edit 2: When I run the tests using xctool, the tests that call BitArray.nextSet() behave inconsistently: they only work some of the time.

+5
2 answers

The Objective-C version of numberOfTrailingZeros:

    // Ported from OpenJDK Integer.numberOfTrailingZeros implementation
    - (int32_t)numberOfTrailingZeros:(int32_t)i {
        int32_t y;
        if (i == 0) return 32;
        int32_t n = 31;
        y = i << 16; if (y != 0) { n = n - 16; i = y; }
        y = i << 8;  if (y != 0) { n = n - 8;  i = y; }
        y = i << 4;  if (y != 0) { n = n - 4;  i = y; }
        y = i << 2;  if (y != 0) { n = n - 2;  i = y; }
        return n - (int32_t)((uint32_t)(i << 1) >> 31);
    }

When translating numberOfTrailingZeros, you changed the return type from Int32 to Int. That is fine in itself, but the last line of the function was not translated correctly.

In numberOfTrailingZeros replace this:

    return n - Int((UInt((i << 1)) >> 31))

With this:

    return n - Int(UInt32(bitPattern: i << 1) >> 31)

Converting to UInt32 discards everything except the lower 32 bits. Since you converted to UInt instead, those bits were never discarded. Initializing with bitPattern: reinterprets the bits of the (possibly negative) Int32 as a UInt32, so the following >> 31 matches the logical shift in the Objective-C code.
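For illustration (a minimal sketch of my own, not from the original answer; the value of i is just an example), UInt32(bitPattern:) mirrors the (uint32_t) cast in the Objective-C code:

    // Sketch: reinterpret the bits of a negative Int32 as UInt32,
    // then shift; >> on UInt32 is a logical (zero-fill) shift.
    let i: Int32 = Int32.min                  // bit pattern 0x80000000
    let reinterpreted = UInt32(bitPattern: i) // 2147483648, same 32 bits
    let topBit = reinterpreted >> 31          // 1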

+3

Finally, I found out that startIdx just needs to be initialized after allocation.

    self.startIdx = UnsafeMutablePointer<Int32>.alloc(bitsLength * sizeof(Int32))
    self.startIdx.initializeFrom(Array(count: bitsLength, repeatedValue: 0))

Or use calloc to do it in a single line of code:

    self.startIdx = unsafeBitCast(calloc(bitsLength, sizeof(Int32)), UnsafeMutablePointer<Int32>.self)
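Either way, the key point is that memory obtained with alloc is not guaranteed to be zeroed. As a quick sanity check (an illustrative sketch I'm adding, not part of the original answer; buffer is a hypothetical local), you can verify that every word reads back as zero after initialization:

    // Illustrative check: after zero-initialization each stored word should be 0.
    let buffer = UnsafeMutableBufferPointer(start: startIdx, count: bitsLength)
    for word in buffer {
        assert(word == 0, "storage was not zero-initialized")
    }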

In addition, I use a lazy var to delay initializing the UnsafeMutableBufferPointer until the property is first used.

    lazy var bits: UnsafeMutableBufferPointer<Int32> = {
        return UnsafeMutableBufferPointer<Int32>(start: self.startIdx, count: self.bitsLength)
    }()

Also, don't forget deinit:

    deinit {
        startIdx.destroy()
        startIdx.dealloc(bitsLength * sizeof(Int32))
    }
0

Source: https://habr.com/ru/post/1247255/

