PROBLEM LINK:

Practice
Div-2 Contest
Div-1 Contest

Author and Editorialist: Simon St James
Tester: Suchan Park

DIFFICULTY:

Hard (though arguably easier than MOVCOIN2).

PREREQUISITES:

Persistence, Graph Theory, AVL Tree, Segment Tree

PROBLEM:

Given a rooted tree $G$, find the $c_i^{\text{th}}$ element in a list $L$ of pairs of nodes $(u,v)$ that form a valid reparenting, with a specific ordering on $L$, and remove it from $L$, with each $c_i$ being processed online. A valid reparenting is a $(u,v)$ where severing $u$ from its parent in $G$ and adding an edge between $u$ and $v$ results in a tree.

QUICK EXPLANATION:

The reparenting $(u,v)$ is valid if and only if $v$ is not a descendant of $u$. As we find each reparenting, we simulate its removal from $L$ by mapping each $c_i$ to point to the corresponding index $X_i$ in the original $L$ using e.g. an order-statistic tree, so the problem boils down to finding each requested element $(u_i, v_i)$ at a given index $X_i$ in the original $L$, online.

The solution is broken down into three "phases", one for each of the three clauses in the ordering:

Phase One: Find $u_i$ (the $\textit{nodeToReparent}$). For each node $u$ in order, find the number of $v$'s such that $(u,v)$ is a valid reparenting, and form a prefix sum array from this. The first element in this array strictly greater than $X_i$ gives our $u_i$, and can be found via binary search.

Phase Two: Find the $\textit{newParentHeight}$, i.e. $\textit{height}(v_i)$. We know $u_i$ and need to find the $Y_i^{\textit{th}}$ element in $L$ that reparents $u_i$, where $Y_i$ is easily computed. Using some simple observations and pre-computed lookups, we can compute the number of non-descendants of $u_i$ with height at most a given $h$ ($\textit{findNumNonDescendantsUpToHeight}(u_i, h)$) in $\mathcal{O}(\log N)$. Similarly to Phase One, we can then perform an Abstract Binary Search on $h$ to find the correct $\textit{height}(v_i)$.

Phase Three: Find the final $v_i$ ($\textit{newParent}$). We know $u_i$ and $\textit{height}(v_i)$, and now just need to find $v_i$, which is the $Z_i^{\text{th}}$ node at height $\textit{height}(v_i)$ that is not a descendant of $u_i$, where the index $Z_i$ is easily computed. If we list all nodes with the given height in the order they are visited in a DFS ($\textit{NHDFS}(h)$) in another precomputation step, we see that the descendants of $u_i$ at height $\textit{height}(v_i)$ all lie in an easily-found interval $(l,r)$ in $\textit{NHDFS}(\textit{height}(v_i))$. By forming persistent AVL Trees of the prefixes and suffixes of $\textit{NHDFS}(h)$ for each $h$ in the precomputation step, we can find in $\mathcal{O}(1)$ a pair of AVL Trees representing all (sorted) nodes to the left of $l$ and all to the right of $r$ in $\textit{NHDFS}(\textit{height}(v_i))$, respectively. We can then adapt the standard algorithm for finding the $Z_i^{\text{th}}$ element in two sorted arrays to work with AVL Trees and find the $Z_i^{\text{th}}$ node not in $(l,r)$, i.e. $v_i$.

EXPLANATION:

As is often the case, my solution is rather clunkier than other people's (the approach to Phase Three in particular), but I'm going to stick with my original solution regardless :)

As mentioned in the Quick Explanation, the pair $(u,v)$ is a valid reparenting if and only if $v$ is not a descendant of $u$ in the (rooted) tree $G$. The size of the list $L$ of valid reparentings is $\mathcal{O}(N^2)$; far too big to construct, let alone to erase from for all $Q$ queries, so we have to be a bit more cunning!

Again as mentioned, we don't actually remove elements from $L$; instead we track the indices of the elements of $L$ that we've removed and use this information to map each new $c_i$ to its corresponding index $X_i$ in the original list $L$. Most people seem to use gcc's internal __gnu_pbds::tree for this, though since I was writing a persistent AVL Tree anyway, I used that to implement the tracking and mapping; see the $\textit{IndexRemapper}$ class in my code.
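
For concreteness, here's a minimal sketch of the order-statistic-tree approach to this remapping (this isn't my $\textit{IndexRemapper}$; the name `remapToOriginalIndex` and its signature are purely illustrative). We store the original indices we've already removed, and binary search for the smallest original index whose position amongst the *remaining* indices equals $c_i$:

    #include <ext/pb_ds/assoc_container.hpp>
    #include <ext/pb_ds/tree_policy.hpp>
    #include <cstdint>
    #include <functional>

    using OrderStatTree = __gnu_pbds::tree<int64_t, __gnu_pbds::null_type, std::less<int64_t>,
                                           __gnu_pbds::rb_tree_tag,
                                           __gnu_pbds::tree_order_statistics_node_update>;

    // Map c, the 0-relative index into the *current* (shrinking) list, to the corresponding
    // index in the original list L of size listSize, then mark that original index as removed.
    // Assumes c is a valid index into the current list.
    int64_t remapToOriginalIndex(OrderStatTree& removedIndices, int64_t c, int64_t listSize)
    {
        int64_t low = 0, high = listSize - 1, result = -1;
        while (low <= high)
        {
            const int64_t mid = (low + high) / 2;
            // For an un-removed original index mid, this is its 0-relative position in the
            // current list; it is non-decreasing in mid, so we can binary search on it.
            const int64_t posInCurrentList = mid - removedIndices.order_of_key(mid + 1);
            if (posInCurrentList < c)
                low = mid + 1;
            else
            {
                result = mid;
                high = mid - 1;
            }
        }
        removedIndices.insert(result);
        return result;
    }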

Having sidestepped the issue of removing elements from $L$, the problem becomes "find the $X_i^{\textit{th}}$ element in the original $L$, processing each $X_i$ online". I haven't actually tried it, but I suspect that removing the requirement that the elements be found online would lead to a significantly easier problem.

Anyway, onto the first sub-problem, Phase One: finding $u_i$ ($\textit{nodeToReparent}$ in the code) for the remapped $c_i$, i.e. for the index $X_i$ in the original list $L$, without constructing $L$!

There are no reparentings with $u=1$, so the first few elements of $L$ are taken up by the valid reparentings that reparent node $2$; then the next few are those that reparent node $3$, etc. For a given $u$, the number of valid reparentings $(u,v)$ that reparent $u$ is simply the number of $v$ such that $v$ is not a descendant of $u$, i.e. $N - u.\textit{numDescendants}$. We compute $u.\textit{numDescendants}$ for all $u$ in $\mathcal{O}(N)$ in a precomputation step, and then create a prefix sum array $\textit{CRPS}$ ($\textit{numCanReparentToPrefixSum}$ in the code) such that $\textit{CRPS}(u)$ is the total number of valid reparentings that reparent a node $x$ with $x \le u$.
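
Here's a quick sketch of that precomputation, with illustrative names rather than the ones from my code (and with plain recursion, which a real solution might swap for an explicit stack on very deep trees):

    #include <cstdint>
    #include <vector>

    // numDescendants[u] counts u itself plus everything below it in the rooted tree.
    int computeNumDescendants(int u, int parent, const std::vector<std::vector<int>>& adjacency,
                              std::vector<int>& numDescendants)
    {
        numDescendants[u] = 1;
        for (const int neighbour : adjacency[u])
            if (neighbour != parent)
                numDescendants[u] += computeNumDescendants(neighbour, u, adjacency, numDescendants);
        return numDescendants[u];
    }

    // CRPS(u) = total number of valid reparentings (x, v) with x <= u; CRPS(0) = 0.
    std::vector<int64_t> computeCRPS(int N, const std::vector<int>& numDescendants)
    {
        std::vector<int64_t> CRPS(N + 1, 0);
        for (int u = 1; u <= N; u++)
            CRPS[u] = CRPS[u - 1] + (N - numDescendants[u]);
        return CRPS;
    }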

As an example, here is the $L$ used in example test case 2 of the Problem, adjusted so that the indices are 0-relative:

$L$ for Example Testcase 2

index  u  v  h(v)
0. 2 1 0
1. 2 5 1
2. 2 6 1
3. 2 7 1
4. 2 3 2
5. 2 4 2
6. 3 1 0
7. 3 2 1
8. 3 5 1
9. 3 6 1
10. 3 7 1
11. 3 4 2
12. 4 1 0
13. 4 2 1
14. 4 5 1
15. 4 6 1
16. 4 7 1
17. 4 3 2
18. 5 1 0
19. 5 2 1
20. 5 6 1
21. 5 7 1
22. 6 1 0
23. 6 2 1
24. 6 5 1
25. 6 7 1
26. 6 3 2
27. 6 4 2
28. 7 1 0
29. 7 2 1
30. 7 5 1
31. 7 6 1
32. 7 3 2
33. 7 4 2

and here is the table of $\textit{CRPS}(u)$ for each $u$ for this example, in order ($u=1$ is omitted):

u  CRPS(u)
2 6
3 12
4 18
5 22
6 28
7 34

If we look at the $17^{\text{th}}$ (0-relative!) element in $L$, we see that it reparents the node $4$.
If we look at the $18^{\text{th}}$ element in $L$, we see that it reparents the node $5$.
If we look at the $28^{\text{th}}$ element in $L$, we see that it reparents the node $7$.
In general, hopefully the pattern is clear: the node reparented by the $X_i^{\text{th}}$ element of $L$ is the first $u$ such that $\textit{CRPS}(u) > X_i$. Since $\textit{CRPS}$ is non-decreasing, we can easily find this $u$ using a binary search, and so find our $u_i$ in $\mathcal{O}(\log N)$, fulfilling Phase One.
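
In code, Phase One (plus the computation of the index $Y_i$, which we'll need in a moment) then boils down to a single `std::upper_bound` over $\textit{CRPS}$; this sketch continues the one above:

    #include <algorithm>
    #include <utility>

    // Given the 0-relative index X into the original list L, return the node u_i that the
    // X-th reparenting reparents, together with the index Y_i of that reparenting amongst
    // the reparentings of u_i.  CRPS is the prefix sum array from the earlier sketch.
    std::pair<int, int64_t> findNodeToReparent(const std::vector<int64_t>& CRPS, int64_t X)
    {
        // The first u with CRPS(u) > X.
        const int u = std::upper_bound(CRPS.begin() + 1, CRPS.end(), X) - CRPS.begin();
        const int64_t Y = X - CRPS[u - 1];
        return { u, Y };
    }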

Now that we know $u_i$, we can restrict our attention to the sub-list of $L$ of reparentings that reparent $u_i$; our final desired reparenting is at some index $Y_i$ in this sublist. How do we find $Y_i$? In focussing on this sublist, we are ignoring all the first elements of $L$ that reparent a node $x < u_i$, so we must subtract this number from $X_i$. By definition, this number is $\textit{CRPS}(u_i - 1)$, so $Y_i = X_i - \textit{CRPS}(u_i - 1)$. The index $Y_i$ is called $\textit{numOfReparentingThatReparentsNode}$ in the code; I'll be sticking with $Y_i$ here, for obvious reasons :)

So: for Phase Two, we need to find the height of $v_i$ ($\textit{newParentHeight}$ in the code), which is the height of $v$ in the $Y_i^{\textit{th}}$ valid reparenting that reparents $u_i$. In Phase One, we made a tally of all valid reparentings that reparented a node less than or equal to $x$, and found the first $x$ such that this tally exceeded $X_i$; we do a similar trick here, computing the number of reparentings that reparent $u_i$ to a node with height less than or equal to $h$, $\textit{NDUH}(u_i, h)$ ($\textit{findNumNonDescendantsUpToHeight}$ in the code), and finding the first $h$ such that $\textit{NDUH}(u_i, h)$ exceeds $Y_i$. To work out how to compute $\textit{NDUH}(u_i, h)$, consider the following schematic (click to animate):

Thus, $\textit{NDUH}(u_i, h) = |A| - |AD| = |A| - |D| + |DH|$, where $A$ is the set of all nodes with height at most $h$; $D$ the set of all descendants of $u_i$ (which, as with $\textit{numDescendants}$, includes $u_i$ itself); $AD$ the set of descendants of $u_i$ with height at most $h$; and $DH$ the set of descendants of $u_i$ with height greater than $h$. $|A|$ and $|D|$ are trivial to compute, so it remains to compute $|DH|$, which turns out to be a sum of proper-descendant counts.
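
As a concrete check using Example Testcase 2 (from the table of $L$ above, node $5$'s descendants are $\{5, 3, 4\}$, with $3$ and $4$ at height $2$): $\textit{NDUH}(5, 1) = |A| - |D| + |DH| = 5 - 3 + 2 = 4$, which matches the four entries of $L$ (indices $18$ to $21$) that reparent node $5$ to a node of height at most $1$.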

Now, in the precomputation step, we perform a DFS and use the common technique of logging, for each node $x$, the "time" at which we first visit $x$ ($x.\textit{dfsBeginVisit}$) and the time at which we finish exploring $x$ ($x.\textit{dfsEndVisit}$). We then form a list, for each $h$, of all nodes $x$ with $x.\textit{height} = h$ ordered by their $\textit{dfsBeginVisit}$, $\textit{NHDFS}(h)$ (called $\textit{nodesAtHeightInDFSOrder}$ in the code). $\textit{NHDFS}$ will come in useful for Phase Three, too.

A well-known fact is that $y$ is a descendant of $x$ if and only if $y.\textit{dfsBeginVisit} > x.\textit{dfsBeginVisit}$ and $y.\textit{dfsEndVisit} < x.\textit{dfsEndVisit}$, and as a consequence, we see that the set of nodes at height $h$ that are descendants of a node $x$ forms an interval in $\textit{NHDFS}(h)$, and this interval $[l, r]$ can be found via a binary search on $\textit{NHDFS}(h)$ (see $\textit{descendantRangeFor}$ in the code). To compute $\textit{DH}$ we need to find the sum of the proper descendants of nodes in an interval of $\textit{NHDFS}(h)$, which can be done by pre-computing prefix sum arrays ($\textit{numProperDescendantsForNodeAtHeightPrefixSum}$ in the code).
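
Here's a sketch of this part of the precomputation together with $\textit{descendantRangeFor}$ (illustrative names again; I use a half-open range $[l, r)$ below purely for convenience):

    #include <algorithm>
    #include <utility>
    #include <vector>

    struct NodeInfo
    {
        int height = 0;
        int dfsBeginVisit = 0;
        int dfsEndVisit = 0;
    };

    // Recursive DFS (again, a real solution might prefer an explicit stack) recording
    // begin/end visit "times", heights, and NHDFS(h) for every height h.
    // "children" holds each node's children in the rooted tree.
    void doDFS(int node, int height, int& time, const std::vector<std::vector<int>>& children,
               std::vector<NodeInfo>& info, std::vector<std::vector<int>>& NHDFS)
    {
        info[node].height = height;
        info[node].dfsBeginVisit = ++time;
        // Nodes are appended in DFS order, so NHDFS[height] ends up sorted by dfsBeginVisit.
        NHDFS[height].push_back(node);
        for (const int child : children[node])
            doDFS(child, height + 1, time, children, info, NHDFS);
        info[node].dfsEndVisit = ++time;
    }

    // The half-open range [l, r) of positions in NHDFS[h] whose nodes are descendants of u
    // (including u itself, if u happens to be at height h).
    std::pair<int, int> descendantRangeFor(int u, int h, const std::vector<NodeInfo>& info,
                                           const std::vector<std::vector<int>>& NHDFS)
    {
        const auto& nodesAtHeight = NHDFS[h];
        const auto beginVisitLess = [&info](int node, int visitTime)
        {
            return info[node].dfsBeginVisit < visitTime;
        };
        const int l = std::lower_bound(nodesAtHeight.begin(), nodesAtHeight.end(),
                                       info[u].dfsBeginVisit, beginVisitLess) - nodesAtHeight.begin();
        const int r = std::lower_bound(nodesAtHeight.begin(), nodesAtHeight.end(),
                                       info[u].dfsEndVisit, beginVisitLess) - nodesAtHeight.begin();
        return { l, r };
    }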

So given a $u_i$ and a height $h$, we can now compute $\textit{NDUH}(u_i, h)$ in $\mathcal{O}(\log N)$, and we need to find the first $h$ such that this value exceeds $Y_i$. We could list all such $\textit{NDUH}$ for each height $h$, but this would result in an $\mathcal{O}(N^2)$ algorithm, so instead we use the fact that $\textit{NDUH}$ is non-decreasing with $h$ to perform an Abstract Binary Search on $h$, finally finding the object of Phase Two: $\textit{height}(v_i)$.
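
Putting those pieces together (and building on the previous sketch), here's a sketch of $\textit{findNumNonDescendantsUpToHeight}$ and the Abstract Binary Search over $h$; `numNodesUpToHeight` and `properDescendantsPrefixSum` are assumed to have been built in the precomputation ($|A|$ for each height, and the prefix sums over $\textit{NHDFS}(h)$ of each node's proper-descendant count, respectively):

    #include <cstdint>
    #include <functional>

    // NDUH(u, h): the number of non-descendants of u with height <= h.
    int64_t findNumNonDescendantsUpToHeight(int u, int h,
            const std::vector<NodeInfo>& info,
            const std::vector<std::vector<int>>& NHDFS,
            const std::vector<int64_t>& numNodesUpToHeight,                  // |A| for each h
            const std::vector<int>& numDescendants,                          // |D| (u itself included)
            const std::vector<std::vector<int64_t>>& properDescendantsPrefixSum)
    {
        if (h < info[u].height)
            return numNodesUpToHeight[h]; // u has no descendants at height <= h.
        const auto [l, r] = descendantRangeFor(u, h, info, NHDFS);
        // |DH|: descendants of u with height > h, counted via their (unique) ancestors
        // amongst u's descendants at height exactly h.
        const int64_t DH = properDescendantsPrefixSum[h][r] - properDescendantsPrefixSum[h][l];
        return numNodesUpToHeight[h] - numDescendants[u] + DH;
    }

    // Abstract Binary Search: the smallest h in [0, maxHeight] with NDUHForHeight(h) > Y,
    // or -1 if there is none (which won't happen for a valid query).
    int findNewParentHeight(int64_t Y, int maxHeight, const std::function<int64_t(int)>& NDUHForHeight)
    {
        int low = 0, high = maxHeight, result = -1;
        while (low <= high)
        {
            const int mid = (low + high) / 2;
            if (NDUHForHeight(mid) > Y)
            {
                result = mid;
                high = mid - 1;
            }
            else
                low = mid + 1;
        }
        return result;
    }

Here we'd call `findNewParentHeight` with a lambda that fixes $u_i$ and forwards to `findNumNonDescendantsUpToHeight`.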

Now onto Phase Three. We now know $u_i$ and $\textit{height}(v_i)$, and want to find $v_i$ itself (called $\textit{newParent}$ in the code), which is the $Z_i^{\text{th}}$ element in the sublist of $L$ that reparents $u_i$ to a node with height $\textit{height}(v_i)$. In restricting our attention to this sub-list, we are skipping all elements that reparent $u_i$ to a parent with height less than $\textit{height}(v_i)$, so $Z_i = Y_i - \textit{NDUH}(u_i, \textit{height}(v_i) - 1)$. If $\textit{height}(v_i) > 0$, of course; otherwise $Z_i = Y_i$ :) $Z_i$ is called $\textit{numOfReparentingForNodeAndNewHeight}$ in the code.

This is a slightly harder sub-problem to solve than Phase Two, but thankfully a bit easier to explain. Let $H=\textit{NHDFS}(h)$. Recall that the set of nodes at height $h$ that are descendants of $u_i$ forms an interval $[l, r]$ in $H$. If we formed a list consisting of the elements of $H$ before this interval (a prefix of $H$) and the elements after it (a suffix of $H$), sorted it, and found the $Z_i^{\text{th}}$ element in the sorted list, then we'd have our final $v_i$. This would be far too slow, however.

If we could get a sorted array of that prefix of $H$ and a sorted array of that suffix of $H$, we could use the well-known Find the $k^{\text{th}}$ Element of Two Sorted Arrays algorithm to find the required $Z_i^{\text{th}}$ element, but obtaining these sorted arrays would still be too slow. We can do something similar, though, using persistent AVL Trees.

Recall that persistent (or versioned) data structures are structures that are in a certain sense immutable: when we appear to change them, we in fact bump the version number and make the change on the new version of the structure, leaving the previous version unchanged and accessible via the prior version number. For Persistent AVL Trees, we can still insert a value in $\mathcal{O}(\log N)$, but have the added bonus that we can jump to any previous revision (one revision per insertion) of the tree in $\mathcal{O}(1)$.

How does this help? Imagine that, in our precomputation step, for each $h$, we created an AVL Tree, $\textit{prefixesForHeight}(h)$. The first revision (version $0$) of this tree is empty. We then insert the first value of $H=\textit{NHDFS}(h)$ into the tree; version $1$ contains the first element (the prefix of size one) of $H$, in sorted order. We then insert the second element of $H$; version $2$ of the tree contains the first two elements (the prefix of size two) of $H$, again in sorted order. We continue until all elements of $H$ have been added, and then create a similar tree, $\textit{suffixesForHeight}(h)$. Revision $0$ of this tree is empty, and for revision one we add the last element of $H$ (the suffix of size one). Then we get revision two by adding the last-but-one element of $H$, and so on until all elements have been added.
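
To illustrate the path-copying idea that makes this cheap, here's a stripped-down sketch: it omits the AVL rebalancing (which the real code needs to keep the height at $\mathcal{O}(\log N)$), but shows how each insertion creates a new version while leaving every earlier version untouched, and how the versions of $\textit{prefixesForHeight}(h)$ are built from $\textit{NHDFS}(h)$:

    #include <memory>
    #include <vector>

    struct PersistentNode
    {
        int value = 0;
        int numInSubtree = 1; // subtree size; lets us count/rank values in O(height).
        std::shared_ptr<const PersistentNode> left, right;
    };
    using NodePtr = std::shared_ptr<const PersistentNode>;

    // Path-copying insert: only the nodes on the path from the root to the insertion point
    // are cloned; everything else is shared with the previous version, so each insertion
    // costs O(height) time and memory and never disturbs older versions.
    NodePtr insert(const NodePtr& root, int value)
    {
        if (!root)
        {
            auto leaf = std::make_shared<PersistentNode>();
            leaf->value = value;
            return leaf;
        }
        auto newRoot = std::make_shared<PersistentNode>(*root);
        newRoot->numInSubtree++;
        if (value < root->value)
            newRoot->left = insert(root->left, value);
        else
            newRoot->right = insert(root->right, value);
        return newRoot;
    }

    // prefixesForHeight(h): version i contains the first i elements of NHDFS(h), in sorted
    // order; suffixesForHeight(h) is built identically but feeding NHDFS(h) in reverse.
    std::vector<NodePtr> buildPrefixVersions(const std::vector<int>& nodesAtHeightInDFSOrder)
    {
        std::vector<NodePtr> versions = { nullptr }; // version 0: the empty tree.
        for (const int node : nodesAtHeightInDFSOrder)
            versions.push_back(insert(versions.back(), node));
        return versions;
    }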

TODO - insert the Epic manim animation illustrating all this when I finally get it done!

Now, this doesn't directly give us the sorted array of the prefix of $H$ before the descendant interval, nor that of the suffix after it, but it does give us, in $\mathcal{O}(1)$, a pair of trees representing this pair of sorted arrays: the appropriate revisions of $\textit{prefixesForHeight}(h)$ and $\textit{suffixesForHeight}(h)$. We can now adapt the "Find the $k^{\text{th}}$ Element of Two Sorted Arrays" algorithm to work with AVL Trees instead of arrays and so find the object of Phase Three, $v_i$ (see $\textit{findKthFromPair}$ in the code).
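
As a sketch of the "find the $k^{\text{th}}$ from a pair of trees" step (building on the size-augmented nodes from the previous sketch): rather than reproduce the two-sorted-arrays-style walk that my $\textit{findKthFromPair}$ performs, the version below simply binary searches on the answer's value using subtree-size counts, which, for balanced trees, is an $\mathcal{O}(\log^2 N)$ stand-in for the same idea:

    // Number of values <= x stored in the tree rooted at root (O(height) per call).
    int countLessThanOrEqualTo(const NodePtr& root, int x)
    {
        if (!root)
            return 0;
        if (x < root->value)
            return countLessThanOrEqualTo(root->left, x);
        const int leftSize = root->left ? root->left->numInSubtree : 0;
        return leftSize + 1 + countLessThanOrEqualTo(root->right, x);
    }

    // The k-th (0-relative) smallest value across the two trees, whose values are assumed
    // to be distinct integers in [1, maxValue] (node labels, in our case).
    int findKthFromPair(const NodePtr& tree1, const NodePtr& tree2, int k, int maxValue)
    {
        int low = 1, high = maxValue, result = -1;
        while (low <= high)
        {
            const int mid = (low + high) / 2;
            if (countLessThanOrEqualTo(tree1, mid) + countLessThanOrEqualTo(tree2, mid) >= k + 1)
            {
                result = mid;
                high = mid - 1;
            }
            else
                low = mid + 1;
        }
        return result;
    }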

And that's it!

Complexity Analysis

ALTERNATE EXPLANATION:

Most people solved Phases One and Two in the same way, and most people used some form of persistence for Phase Three, but not many people used AVL Trees: instead, they opted for Persistent Segment Trees (see e.g. the Tester's Solution below).

There were one or two solutions using Wavelet Trees, and a few using Merge Sort Trees.

SOLUTIONS:

Setter's Solution (C++)

**TODO - use a publicly accessible link to the following code: http://campus.codechef.com/SEP20TST/viewsolution/36615800/ - we exceed the discuss post size limit if we include it inline!**

Tester's Solution (C++)

    #include <bits/stdc++.h>
     
    #include <ext/pb_ds/assoc_container.hpp> // Common file 
    #include <ext/pb_ds/tree_policy.hpp> 
      
    typedef __gnu_pbds::tree<long long, __gnu_pbds::null_type, std::less<long long>, __gnu_pbds::rb_tree_tag, 
                 __gnu_pbds::tree_order_statistics_node_update> 
        new_tree; 
     
    namespace Input {
        const int BUFFER_SIZE = int(1.1e5);
     
        char _buf[BUFFER_SIZE + 10];
        int _buf_pos, _buf_len;
     
        char seekChar() {
            if(_buf_pos >= _buf_len) {
                _buf_len = fread(_buf, 1, BUFFER_SIZE, stdin);
                _buf_pos = 0;
            }
            assert(_buf_pos < _buf_len);
            return _buf[_buf_pos];
        }
     
        bool seekEof() {
            if(_buf_pos >= _buf_len) {
                _buf_len = fread(_buf, 1, BUFFER_SIZE, stdin);
                _buf_pos = 0;
            }
            return _buf_pos >= _buf_len;
        }
     
        char readChar() {
            char ret = seekChar();
            _buf_pos++;
            return ret;
        }
     
        long long readLong(long long lb, long long rb) {
            char c = readChar();
            long long mul = 1;
            if(c == '-') {
                c = readChar();
                mul = -1;
            }
            assert(isdigit(c));
     
            long long ret = c - '0';
            char first_digit = c;
            int len = 0;
            while(!seekEof() && isdigit(seekChar()) && ++len <= 19) {
                ret = ret * 10 + readChar() - '0';
            }
            ret *= mul;
     
            if(len >= 2) assert(first_digit != '0');
            assert(len <= 18);
            assert(lb <= ret && ret <= rb);
            return ret;
        }
     
        int readInt(int lb, int rb) {
            return readLong(lb, rb);
        }
     
        void readEoln() {
            char c = readChar();
            assert(c == '\n');
            //assert(c == '\n' || (c == '\r' && readChar() == '\n'));
        }
     
        void readSpace() {
            char c = readChar();
            assert(c == ' ');
        }
    }
    using namespace Input;
     
    const int MAX_N = 200000;
    const int MAX_SUM_N = 200000;
    const int MAX_Q = 200000;
    const int MAX_SUM_Q = 200000;
     
    class DisjointSet {
        std::vector<int> par;
    public:
        DisjointSet(int n): par(n+1) {
            for(int i = 1; i <= n; i++) par[i] = i;
        }
     
        int get(int x) {
            return par[x] == x ? x : (par[x] = get(par[x]));
        }
     
        bool merge(int x, int y) {
            x = get(x);
            y = get(y);
            par[x] = y;
            return x != y;
        }
    };
     
    namespace PersistentSegmentTree {
        struct node {
            int sum, left, right, ts;
        };
     
        node tree[int(1.2e7)]; int NUM_NODES;
        int TIME;
     
        node& get_node(int x) {
            node& ret = tree[x];
            if(ret.ts < TIME) {
                ret = {0, 0, 0, TIME};
            }
            return ret;
        }
     
        node& new_node() {
            return get_node(++NUM_NODES);
        }
     
        int insert(int ni, int nl, int nr, int x) {
            node &nd = get_node(ni);
            int ret = ++NUM_NODES;
            node &new_node = get_node(ret);
     
            new_node = nd;
            if(nl == nr) {
                new_node.sum += 1;
                return ret;
            }
     
            int nm = (nl + nr) >> 1;
            if(x <= nm) {
                new_node.left = insert(nd.left, nl, nm, x);
            }else {
                new_node.right = insert(nd.right, nm+1, nr, x);
            }
     
            new_node.sum = tree[new_node.left].sum + tree[new_node.right].sum;
            return ret;
        }
     
        int get_sum (int ni, int nl, int nr, int x, int y) {
            if(ni == 0 || x > y) return 0;
     
            node &nd = get_node(ni);
            if(nl == x && nr == y) {
                return nd.sum;
            }
            
            int nm = (nl + nr) >> 1;
            int ret = 0;
            if(x <= nm) {
                ret += get_sum(nd.left, nl, nm, x, std::min(y, nm));
            }
            if(nm < y) {
                ret += get_sum(nd.right, nm+1, nr, std::max(x, nm+1), y);
            }
            return ret;
        }
     
        int get_kth(int nl, int nr, std::vector<int> nodes, std::vector<int> coefs, long long &k) {
            const int NUM_CONSIDERING_NODES = nodes.size();
            while(nl < nr) {
                int nm = (nl + nr) >> 1;
     
                int leftsum = 0;
                for (int i = 0; i < NUM_CONSIDERING_NODES; i++) {
                    if (!nodes[i]) continue;
                    auto &nd = get_node(nodes[i]);
                    if (!nd.left) continue;
                    auto &ndl = get_node(nd.left);
                    leftsum += ndl.sum * coefs[i];
                }
     
                if (k < leftsum) {
                    for (int i = 0; i < NUM_CONSIDERING_NODES; i++) {
                        if (!nodes[i]) continue;
                        nodes[i] = get_node(nodes[i]).left;
                    }
                    nr = nm;
                } else {
                    k -= leftsum;
                    for (int i = 0; i < NUM_CONSIDERING_NODES; i++) {
                        if (!nodes[i]) continue;
                        nodes[i] = get_node(nodes[i]).right;
                    }
                    nl = nm+1;
                }
            }
            return nl;
        }
     
        void clear() {
            TIME++;
            NUM_NODES = 0;
        }
    }
     
     
    int sumN = 0, sumQ = 0;
     
    std::vector<int> gph[MAX_N+5];
    int depth[MAX_N+5];
    int subtreeSize[MAX_N+5];
    long long candidatePrefixSum[MAX_N+5];
     
    std::vector<int> ord;
    std::vector<int> nodes_with_depth[MAX_N+5], roots_with_depth[MAX_N+5];
     
    int L[MAX_N+5], R[MAX_N+5];
     
    int dfs (int u, int p) {
        depth[u] = depth[p] + 1;
        ord.push_back(u);
        nodes_with_depth[depth[u]].push_back(u);
        L[u] = ord.size();
        for (int v : gph[u]) if(v != p) subtreeSize[u] += dfs(v, u);
        R[u] = ord.size();
        return ++subtreeSize[u];
    }
     
    int roots[MAX_N+5];
     
    void run() {
        int N = readInt(1, MAX_N);
        readEoln();
        sumN += N;
        assert(sumN <= MAX_SUM_N);
     
        DisjointSet ds(N);
        for (int e = 0; e < N-1; e++) {
            int u = readInt(1, N);
            readSpace();
            int v = readInt(1, N);
            readEoln();
            
            bool merged = ds.merge(u, v);
            assert(merged);
     
            gph[u].push_back(v);
            gph[v].push_back(u);
        }
        
        ord.reserve(N);
        dfs(1, 0);
        for (int i = 1; i <= N; i++) {
            candidatePrefixSum[i] = candidatePrefixSum[i-1] + (N - subtreeSize[i]);
        }
     
        PersistentSegmentTree::clear();
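        // roots[i]: version of a segment tree over depths containing the depths of the
        // first i nodes in DFS (preorder) order; combining roots[N], roots[R[u]] and
        // roots[L[u]-1] with coefficients +1/-1/+1 later gives the depth distribution
        // of u's non-descendants.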
        for (int i = 1; i <= N; i++) {
            roots[i] = PersistentSegmentTree::insert(roots[i-1], 0, N, depth[ord[i-1]]);
        }
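        // roots_with_depth[d][j]: version of a segment tree over node labels containing
        // the first j nodes of depth d in DFS order; the +1/-1/+1 combination used later
        // excludes the contiguous block [pl, pr) of u's descendants at depth d.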
        for (int d = 0; d <= N; d++) if(!nodes_with_depth[d].empty()) {
            PersistentSegmentTree::new_node();
            roots_with_depth[d].push_back(PersistentSegmentTree::NUM_NODES);
            for (int u : nodes_with_depth[d]) {
                int new_root = PersistentSegmentTree::insert(roots_with_depth[d].back(), 1, N, u);
                roots_with_depth[d].push_back(new_root);
            }
        }
     
        int Q = readInt(1, MAX_Q);
        readEoln();
        sumQ += Q;
        assert(sumQ <= MAX_SUM_Q);
        
        new_tree usedPositions;
     
        const int MOD = 1'000'000'007;
        long long decryptionKey = 0;
        for (long long pow2 = 1, pow3 = 1, qid = 0; qid < Q; qid++) {
            (pow2 *= 2) %= MOD;
            (pow3 *= 3) %= MOD;
     
            long long encryptedChoice = readLong(0, (1ll << 36));
            readEoln();
     
            long long decryptedChoice = (encryptedChoice ^ decryptionKey) - 1;
            
            long long target = -1;
            {
                long long low = decryptedChoice;
                long long high = std::min(decryptedChoice + qid, candidatePrefixSum[N]);
                while (low <= high) {
                    long long mid = (low + high) / 2;
                    long long currentPosition = mid - usedPositions.order_of_key(mid+1);
                    if (currentPosition < decryptedChoice) {
                        low = mid+1;
                    } else {
                        target = mid;
                        high = mid-1;
                    }
                }
     
                if (target < 0) {
                    printf("qid = %d, dc = %lld %lld\n", qid, decryptedChoice, candidatePrefixSum[N]);
                }
     
                assert(target >= 0);
                usedPositions.insert(target);
            }
            
            // find target-th position from list
     
            // candidatePrefixSum[u-1] <= target < candidatePrefixSum[u]
            int u = std::upper_bound(candidatePrefixSum, candidatePrefixSum + N + 1, target) - candidatePrefixSum;
            target -= candidatePrefixSum[u-1];
     
            int d = PersistentSegmentTree::get_kth(
                0, N, 
                {roots[N], roots[R[u]], roots[L[u]-1]},
                {+1, -1, +1},
                target
            );
     
            int pl = std::distance(
                nodes_with_depth[d].begin(),
                std::lower_bound(
                    nodes_with_depth[d].begin(), nodes_with_depth[d].end(), u,
                    [&](const int &a, const int &b) { return L[a] < L[b]; }
                )
            );
     
            int pr = std::distance(
                nodes_with_depth[d].begin(),
                std::upper_bound(
                    nodes_with_depth[d].begin(), nodes_with_depth[d].end(), ord[R[u]-1],
                    [&](const int &a, const int &b) { return L[a] < L[b]; }
                )
            );
            int v = PersistentSegmentTree::get_kth(
                1, N,
                {roots_with_depth[d].back(), roots_with_depth[d][pr], roots_with_depth[d][pl]},
                {+1, -1, +1},
                target
            );
     
            assert(target == 0);
            assert(v > 0);
            assert(depth[v] == d);
     
            (decryptionKey += pow2 * u + pow3 * v) %= MOD;
        }
        printf("%lld\n", decryptionKey);
        // initialize global vars
        ord.clear();
        for (int u = 0; u <= N; u++) {
            gph[u].clear();
            depth[u] = 0;
            subtreeSize[u] = 0;
            L[u] = R[u] = 0;
            candidatePrefixSum[u] = 0;
            nodes_with_depth[u].clear();
            roots_with_depth[u].clear();
        }
    }
     
    int main() {
    #ifdef IN_MY_COMPUTER
        freopen("example.in", "r", stdin);
    #endif
     
        int T = readInt(1, 1000);
        readEoln();
        while(T--) {
            run();
        }
        assert(feof(stdin));
        return 0;
    }